Getting Greater Business Value from a Web Service
Thinking of bigger and wider opportunities

Okay, so you've developed what you believe is a useful piece of code and exposed it to the world as a Web service. All comers may now rejoice in the warmth and glow of your artful coding. But who are the people and what are the business systems benefiting from your Web service? Are you reaching all the possible consumers of the service? What more would you have to do to make it as useful as possible to an even greater audience? And, to ask a simple but important question, do you even know exactly who's on the other end using the service?

The truth is, there's only so far you can go to make certain that the people you want to have access to your service are the ones actually using it. Thus begins the careful balancing act between concepts like functionality and performance...and the increasingly important issue of security. Wondering where this is going? Keep reading.

Moving Beyond Tic-Tac-Toe
Microsoft's .NET vision of XML Web services, together with the powerful set of tools MS provided for Web services deployment and use, helped create a new paradigm with tremendous impact. The ability to provide services between systems in the same way we use Web pages to deliver services creates value, as it gives us the opportunity to realize the promise of distributed computing among disparate platforms.

The current use of Web services focuses more on solving real business problems and less on performing the trivial functions initially used to illustrate the concept. (Although personally I still think playing tic-tac-toe via SOAP requests is a great way to experiment with new technology!) Businesses adopt Web services technology to generate revenue and lower costs. In doing so, they deal with the challenge of exposing their data as a service to a particular audience. When fulfilling that service, they decide what degree of security is needed so only that audience (and hopefully it's a paying one) has access to it.

Multiple Audiences, Multiple Security Levels
If you're reading this article you've probably already written a Web service or two, and you've probably also read plenty of white papers and articles on authentication and authorization options in Web services. The examples such articles use frequently imply that a Web service either has a particular level of security or it does not. This in turn suggests that a Web service has only a single audience, for whom that level of security is sufficient.

As developers, we want our code to be as useful as possible to the greatest possible number of users. Why? Well, job security may be one reason. More often it's hubris (listed by Larry Wall and his co-authors in Programming Perl, a book now standard in the personal library of most developers, as one of the three virtues of a truly great programmer, alongside laziness and impatience). We love the idea that something we created is out there helping to make the world a better place.

But for us to keep playing with a new technology like XML Web services, somebody has to be footing the bill. You got it, business users! And what do they want? They want the greatest possible revenue for the least possible investment. Unfortunately, business users don't often believe that first-person shooter games played over the company network embody the greatest revenue opportunities.

So you need an approach to developing Web services that gives your code the greatest possible use and produces the greatest possible revenue. This means moving away from the idea of services that only meet the needs of a singular group of stakeholders. And this in turn means getting away from the idea that a service is either secured or it isn't. We need to think bigger and more broadly about what a Web service can do for the business.

Different Strokes for Different Folks
I was up late last night trying to think of some good examples to explain this approach. Okay, that's a lie. I was supposed to be up late working on this article but instead I was online checking the stats of my fantasy hockey team. It's annoying, because it takes forever to go through all the box scores and then calculate in my head how many points I've earned using our league's scoring system. Of course I could have just waited until the next morning when the fantasy site grabs the box scores from a major online news provider and updates the official league scoring. But that requires waiting and I hate waiting (impatience being another of the virtues of a programmer).

It then occurred to me there's an organization out there that controls the origination of that statistical data and publishes that data to subscribing sources. Who's interested in that data? Sports sites, portal/content providers, newspapers, and of course, fantasy hockey sites - to name just a few. Each one of these data consumers varies in its needs as to the timeliness of the data. Accordingly, each attaches a particular value to the data. The sports Web sites may want the data immediately with extensive detail, while the weekly newspaper in a small town may prefer consolidated information weekly.

The important point to consider here is that there's an initial owner of that data who then publishes it out to multiple consumers (and don't forget to consider both the potential internal and external consumers of the service). Honestly, I've no idea how this happens today, but the idea illustrates my point well. Data already exists that can be delivered to multiple stakeholders either for profit or, in other instances, as a cost-cutting measure.

The goal then is to create Web services that provide different levels of service or data to those specific consumers. Perhaps various rates are charged depending on the level of service. Perhaps consumers even get the lowest level of service free. Accomplishing this raises a number of security issues, such as limiting access according to the service purchased, while still providing sufficient functionality and performance.

Figure 1 illustrates the initial model for this approach. If the hockey stats example doesn't seem relevant enough to you and your company, then let's consider other possibilities. How about a manufacturer that updates its production schedule every few hours based on actual orders received through trading partner Web sites and other sales order entry systems (see Figure 2)? While supply chain partners would benefit in receiving hourly updates on materials needed for production, other stakeholders - such as company managers and executives - may need only weekly roll-ups of the data, and this information could be supplied to them through personal portals on the intranet. Still other employees may be interested only in monthly roll-ups, just to give them an idea of how the company is doing. Additionally, investors or analysts may be allowed access only to quarterly roll-ups on a delayed basis. Get the idea? It's all about thinking bigger, in terms of all the different people benefiting from the service.

So how do you develop and implement a service to answer the broad range of needs of all stakeholders? The answer is not much different from the way you're already developing Web services. This article is aimed more at discussing the overall approach to planning and using Web services, and how business should think of them, than it is about a change in the way we actually develop.

A Working Example
For discussion purposes, let's walk through an example of how we might do this. Let's go back to the hockey game because, well, I like hockey.

We completed the first steps. We identified the data under our control that is useful and valuable to a number of consumers. Then, we identified the possible consumers and determined levels of value they might attach to the data and the probable details desired. And then we considered the degree of security required.

For our example, let's assume that, as games are played, the data is entered using an application at the stadium and then submitted to a central data store that we control. Using Visual Studio .NET, we create a Visual Basic COM object that queries the data store every 60 seconds and formats the data into an XML document (including trimmed-down versions at the various delayed levels). We expose this as a Web service named Statistics Service, made available through an ISAPI listener. The Microsoft development environment makes it easy for us to create test clients to be certain the service works as planned.
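To make the trimming idea concrete, here's a minimal sketch in Python (the article's implementation is a Visual Basic COM object; the field names and tier definitions here are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical per-tier field sets; real tiers would be driven by business rules.
TIER_FIELDS = {
    "daily":  {"home", "away", "score"},
    "weekly": {"home", "away"},
}

def build_documents(rows):
    """Build the full stats document plus trimmed versions for delayed tiers."""
    full = ET.Element("stats")
    for row in rows:
        ET.SubElement(full, "game", row)
    docs = {"realtime": full}
    for tier, keep in TIER_FIELDS.items():
        trimmed = ET.Element("stats")
        for row in rows:
            # Keep only the attributes this tier is entitled to see.
            ET.SubElement(trimmed, "game",
                          {k: v for k, v in row.items() if k in keep})
        docs[tier] = trimmed
    return docs
```

A scheduler would invoke this every 60 seconds against the central data store; `rows` stands in for the query result.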

Authorization and Authentication
On receiving a request for the document, we identify the principal making the request, then determine the authorization level. Finally, we answer the request with the XML document the subscriber is authorized to receive. If the subscriber is one of the sports Web sites requesting up-to-the-minute information, we want to corroborate the subscriber, since a premium is paid for this data. But if the requestor is permitted access only to the most recent weekly roll-up, then corroboration isn't critical. In fact, if a request comes in and we can't identify the principal or for some reason authentication or authorization fails, we simply send the weekly roll-up - offering the lowest level of service as a default.
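The dispatch logic just described can be sketched as follows; this is an illustrative Python fragment rather than the article's actual implementation, and the subscriber registry and document placeholders are hypothetical:

```python
# Documents per service level; in practice these are the XML documents
# regenerated every 60 seconds (placeholders here).
DOCS = {
    "realtime": "<stats granularity='minute'/>",
    "weekly":   "<stats granularity='week'/>",
}

# Hypothetical registry mapping authenticated principals to service levels.
SUBSCRIBERS = {
    "sports-site-1": "realtime",
    "small-town-weekly": "weekly",
}

def serve_stats(principal):
    """Answer a request with the document the subscriber is authorized for.

    Unknown or unauthenticated principals (None, or not in the registry)
    fall back to the weekly roll-up rather than being rejected outright.
    """
    level = SUBSCRIBERS.get(principal, "weekly")
    return DOCS[level]
```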

Let's start with the first piece - identifying the principal. The Web service should be designed to handle several security levels accordingly. Since this service will be available over the Internet we won't know the specific IP addresses of all of the consumers, so we want to leverage the authentication features of the protocol used to exchange messages, i.e. HTTP.

Microsoft Internet Information Server 5.0 (IIS) supports all of the various authentication mechanisms for HTTP shown in Table 1 (which is borrowed from MSDN).

Regardless of the security mechanism chosen at each level, we'll create Windows 2000 accounts for each consumer allowed to access the Web service and configure ACLs appropriately on the resources that make up the service. As a default, we also create an anonymous guest account that has access to the lowest level of service.

For a nonpaying user there's no need for authentication - we'll allow a standard or guest login. Paying users will have their own unique usernames and passwords. For daily and up-to-the-minute users there should be a more secure method that discourages hackers. This data is usually possible to obtain by alternative methods (e.g., one could view it free on a subscribing sports site or watch all the games on home satellite and write really, really fast). However, since some subscribers are paying a relative premium to get this data fed directly into their systems on request, it should be just difficult enough to make hacking it, or hijacking the session, not worth the trouble.

A reasonable mechanism in this case is to use hashing to transmit client credentials. Usernames and passwords should not be transported as plain text, so SSL would be used. But using SSL for document transmission would be detrimental to performance. One option to consider is splitting Statistics Service into two services, Stats Service (occurring over HTTP) and Logon Service (which precedes Stats Service for paying customers and will require SSL). Additional options for security mechanisms based on your specific business requirements can be found in the Global XML Web Services Specifications - which offer a security language for Web services. The specifications describe enhancements to SOAP messaging providing three capabilities: credential exchange, message integrity, and message confidentiality. These three mechanisms can be used independently or in combination to accommodate a wide variety of security models and encryption technologies.

Logon Service
In our example the client application begins by sending a logon request over SSL to the Logon Service that contains two parameters (username, password). The request is dispatched to a Logon COM object that accepts the username and password as input. They are authenticated against the known user accounts. The COM object then generates a hashed logon key (hashedkey) which is passed back to the client application and will expire in minutes or hours. The key is generated using Windows Cryptography APIs. The hashedkey becomes the first parameter of each request to Stats Service. The hashedkey will be used as an identifier to map to each user's authorization level according to their corresponding ACL so each one can then be served the corresponding XML document. If the username/password combination sent to Logon Service was the guest ID, then a hashedkey is still returned (always a standard and static hashedkey for guest). When the subsequent request comes to Stats Service, it's mapped to the guest access level. The Stats COM object will only grant access to the weekly roll-up of the statistics and dispatch the stripped-down XML document back to the client.

Statistics Service also needs to maintain some level of service in the event that authentication and authorization fail. This is handled at two levels. The first potential failure is that Logon Service fails to find the username and password supplied. If the Logon COM object can't locate a non-guest username in the list of known users, it will by default return the guest hashedkey to the client. The client application needs to detect this if it's a paying customer for whom limited "guest" information is not acceptable or even useful. Two parameters go back to the client application - hashedkey and accesslevel. The client application can then decide whether or not to place the subsequent request to Stats Service based on its needs. The next consideration is if Logon Service fails altogether. In this instance, the client application receives no hashedkey and instead submits the generic, static hashedkey that permanently maps to guest-level access. (Someone wanting guest access would never even need to use Logon Service, but we'll anticipate that it may happen nonetheless.)
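Under the assumptions above, the logon flow, including the guest fallback, might look like this in Python (the article's implementation uses a VB COM object and the Windows Cryptography APIs; here HMAC-SHA256 and an in-memory session table are stand-ins, and all names are hypothetical):

```python
import hashlib
import hmac
import os
import time

SECRET = os.urandom(32)            # server-side signing secret
GUEST_KEY = "guest-static-key"     # the static, permanently valid guest key
USERS = {"sports-site-1": ("s3cret", "realtime")}  # user -> (password, level)
KEY_TTL = 3600                     # hashedkeys expire after an hour
_sessions = {}                     # hashedkey -> (accesslevel, expiry)

def logon(username, password):
    """Return (hashedkey, accesslevel); unknown users get the guest key."""
    record = USERS.get(username)
    if record is None or record[0] != password:
        return GUEST_KEY, "guest"
    raw = f"{username}:{time.time()}".encode()
    hashedkey = hmac.new(SECRET, raw, hashlib.sha256).hexdigest()
    _sessions[hashedkey] = (record[1], time.time() + KEY_TTL)
    return hashedkey, record[1]

def resolve(hashedkey):
    """Map a hashedkey back to an access level; expired or unknown -> guest."""
    if hashedkey == GUEST_KEY:
        return "guest"
    entry = _sessions.get(hashedkey)
    if entry is None or entry[1] < time.time():
        return "guest"
    return entry[0]
```

Note that every failure path degrades to guest-level access rather than an error, mirroring the design goal of always offering the lowest level of service as a default.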

The description of these in and out parameters, as well as the static guest key, would be published through UDDI in the WSDL for the services. For a public service you may choose to publish it at http://uddi.org, but for internal or more private services the Microsoft platform makes this part even simpler, since Windows .NET Server (when released) will have UDDI built in.

So for Logon Service there are the following parameters:

Logon([in]username, [in]password, [out]hashedkey, [out]accesslevel)

And for Stats Service:

Stats([in]hashedkey, [out]statsdoc)
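A client-side caller might chain the two calls like this (a hedged sketch; `logon_service` and `stats_service` stand in for generated service proxies, and transport details such as SOAP over SSL are omitted):

```python
def fetch_stats(logon_service, stats_service, username, password,
                require_paid=True):
    """Call Logon, inspect the returned access level, then call Stats.

    A paying client that comes back with guest-level access may prefer to
    skip the Stats call rather than accept the stripped-down roll-up.
    """
    hashedkey, accesslevel = logon_service(username, password)
    if require_paid and accesslevel == "guest":
        return None
    return stats_service(hashedkey)
```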
There's nothing groundbreaking about the coding methods here. What's different is the business approach: delivering the output of a single coding effort to the greatest possible number of consumers, despite their varying needs and varying security levels.

There are further security measures surrounding the COM objects that you can take advantage of through the Common Language Runtime (CLR). However, this article isn't about security; you can find out more about that separately elsewhere. Even in a more common scenario like the manufacturing example, the goal is still to allow code written once to meet the needs of various stakeholders and, as the default, to always offer the lowest common denominator of service. For highly secure data, a variation of this approach lets you check for a known IP address and/or certificate from a given partner before publishing the requested XML document to them. Without that check, the service still offers the rolled-up, delayed data.

It's a Web service for the masses, or at least for a whole lot more people than I might have tried to target previously. Now if only there was a Web service that could have accurately predicted that my star goalie would be out several weeks with a twisted ankle.

More Stories By Paul Hernacki

Paul Hernacki is a technical evangelist for Extreme Logic and is the bridge between the technical and business audiences. He articulates Extreme Logic’s solutions and explains how the Microsoft .NET platform delivers business value.
