Friday, September 4, 2009

Microsoft .NET-managed Code Enablement: Examples and Challenges

Developing and deploying a Web service-connected information technology (IT) architecture is no small task. To that end, the Microsoft .NET Framework provides what a business might need: smart clients, servers to host Web services, the development tools and applications to create and use them, and a global network of over 35,000 Microsoft Certified Partner organizations to provide help for users.

Part Four of the series Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement.

For a general discussion of the evolution of system architecture, see Architecture Evolution: From Mainframes to Service-oriented Architecture. For a definition of how the Microsoft .NET environment addresses the situation, see Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement.

Example One: Intuitive Manufacturing Systems

The first example of a .NET-managed product is Intuitive Manufacturing Systems, a Kirkland, Washington (US)-based provider of extended enterprise resource planning (ERP) solutions for small and midsize discrete manufacturers (Intuitive Manufacturing Systems Shows Maturity in Adolescent Age). The company was recently acquired by ravenous (lately, anyway) fellow mid-market vendor Made2Manage Systems (see Made2Manage Systems One Year After: Reenergized and Growing). Intuitive's .NET technological prowess was cited as one of the major attraction points, given that most of Made2Manage's ERP product lines were (at best) somewhere between the .NET-compatible and .NET-enabled evolutionary steps at the time.

Intuitive recently announced the milestone release of Intuitive ERP 8.0, which represents the completion of a major rewrite of Intuitive ERP functionality using .NET-managed code, which started a few years ago. With this release, all major manufacturing processes have been converted to the new architecture. Additionally, several areas of new functionality are now offered in Intuitive ERP 8.0, including new engineering change order (ECO) processes to support new product introduction (NPI) as well as engineering change requests (ECR) to facilitate getting improved product to market faster.

There are also approved supplier tools designed specifically for the growing contract manufacturing industry, which replace the commonly used but inefficient and mistake-prone spreadsheets. Last but not least, to support the demand-driven supply chain, material and capacity requirements planning runs now typically take minutes, eliminating those traditional long planning runs, and thus allowing on-demand planning. Version 8.0 was available to new customers in May 2006, and existing customers will be able to upgrade to Version 8.1 (scheduled for late 2006), when the migration tools should be available.

One should note that Web services are created naturally as a by-product of .NET-managed software, although they are also created naturally as a by-product of the Progress OpenEdge .NET support in, for example, Epicor Vantage (which has not been completely rewritten in pure .NET-managed code). To that end, Intuitive has componentized the business logic into granular .NET objects, whereby all transactions occur in extensible markup language (XML). This means, for one thing, that at Intuitive a Web service means something different than it does elsewhere: many other mid-market vendors have chosen to add "wrappers" to whole legacy applications (such as customer relationship management [CRM] or purchasing), and advertise the ability of these applications to run on an ERP backbone as a composite application or service-oriented architecture (SOA). Some market research surveys show that although this may play well to complex and diverse tier one environments, the concept will not necessarily be embraced by mid-market manufacturers with more homogenous software platforms. Instead, Intuitive has worked hard to split up its applications into usable pieces of functionality that make business sense.
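To make the idea concrete, granular componentization with XML transactions can be sketched as follows. This is a minimal, language-agnostic illustration written in Python; the object name, fields, and XML shape are invented for this example, and are not Intuitive's actual schema:

```python
import xml.etree.ElementTree as ET

def release_purchase_order(request_xml: str) -> str:
    """A granular 'business object': one narrow function, XML in, XML out.

    The same entry point can serve an interactive screen or an external
    Web service caller, because all it ever sees is the XML payload.
    """
    req = ET.fromstring(request_xml)
    po_number = req.findtext("poNumber")
    quantity = int(req.findtext("quantity"))

    # The business rule lives in exactly one place.
    status = "released" if quantity > 0 else "rejected"

    reply = ET.Element("releasePurchaseOrderReply")
    ET.SubElement(reply, "poNumber").text = po_number
    ET.SubElement(reply, "status").text = status
    return ET.tostring(reply, encoding="unicode")

print(release_purchase_order(
    "<releasePurchaseOrder><poNumber>PO-1001</poNumber>"
    "<quantity>25</quantity></releasePurchaseOrder>"))
```

Because each object does one small, business-meaningful thing, exposing it as a Web service is a packaging exercise rather than a rewrite.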

In fact, Web services are simply a "neat" technology until they are actually used. Applying this concept to the demand-driven supply chain, a real-world example of a granular business application is the available-to-promise (ATP) or capable-to-promise (CTP) Web service available in Intuitive ERP 8.0. To make it even more valuable, an Intuitive ERP user will be able to provide key customers with access to the ATP and CTP Web services through Microsoft Office Outlook (with the upcoming release of Microsoft Office 2007) for what-if planning scenarios, thus providing practical supply chain collaboration in real time. Supply chain partners will be able to make decisions quickly based on delivery dates and quantities from current production or stock (using ATP) and from new production plans (using CTP) without having to wait for a return call or e-mail.
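The ATP calculation behind such a Web service can be sketched in a few lines. The figures and dates below are invented for illustration, and a real CTP service would additionally simulate new production capacity:

```python
from datetime import date
from typing import Optional

# Hypothetical supply picture: on-hand stock plus scheduled receipts.
ON_HAND = 40
SCHEDULED_RECEIPTS = [            # (availability date, quantity)
    (date(2006, 7, 10), 30),
    (date(2006, 7, 24), 50),
]

def available_to_promise(requested_qty: int, today: date) -> Optional[date]:
    """Return the earliest date on which the requested quantity is covered.

    ATP accumulates on-hand stock and scheduled receipts in date order;
    a CTP variant would also consider planned new production.
    """
    available = ON_HAND
    if available >= requested_qty:
        return today
    for receipt_date, qty in sorted(SCHEDULED_RECEIPTS):
        available += qty
        if available >= requested_qty:
            return receipt_date
    return None  # cannot promise from the current plan

print(available_to_promise(60, date(2006, 7, 1)))  # covered by the first receipt
```

Exposed as a Web service, a call like this is what lets a customer check a realistic delivery date from a client such as Outlook, without waiting for a return call or e-mail from the planner.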

Furthermore, the source of a transaction remains transparent in the innovative Intuitive Framework. In other words, whether the transaction comes from Intuitive ERP users who are interactively entering it on their computer, or from the outside world (as a Web service), the framework uses a single set of business logic. This should eliminate the gamut of problems that traditionally exist in other applications, where duplicate sets of code are required for different sources of transaction requests and data. Web services technology is still fairly young, and not as robust as it needs to be to fulfill its promise, but .NET-managed code makes it easier to write, deploy, and consume Web services. Unfortunately, many of the other huge benefits of the .NET-managed environment (such as the coherence of an integrated environment) are getting drowned out in all the ongoing hype surrounding Web services and SOA.
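The single-set-of-business-logic point can be illustrated with two thin adapters converging on one core routine. This is a hypothetical sketch in Python, with invented names; the essence is that neither adapter contains any business rules:

```python
import xml.etree.ElementTree as ET

def post_issue(part: str, qty: int) -> dict:
    """The single set of business logic: every caller ends up here."""
    if qty <= 0:
        return {"part": part, "accepted": False}
    return {"part": part, "accepted": True}

def from_interactive_screen(form_fields: dict) -> dict:
    """Thin adapter for a user typing into a client screen."""
    return post_issue(form_fields["part"], int(form_fields["qty"]))

def from_web_service(xml_payload: str) -> dict:
    """Thin adapter for an external caller sending XML."""
    doc = ET.fromstring(xml_payload)
    return post_issue(doc.findtext("part"), int(doc.findtext("qty")))
```

Because both paths converge on `post_issue`, a rule changed there is changed everywhere, which is precisely the duplicate-code problem the framework is said to eliminate.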

Example Two: Visibility Corporation

Another provider of an ERP solution that uses .NET Framework-based SOA and .NET-managed code is Andover, Massachusetts (US)-based Visibility Corporation. Since 1980, its VISIBILITY suite has been used by about 150 manufacturers of engineered products, and by other companies with project-oriented concerns. Now in its seventh generation, with the product dubbed VISIBILITY.net, the company elected to forego the use of wrappers to deliver .NET Framework-based functionality. To that end, the vendor has spent the last four years performing a complete conversion of the core client/server-based application to a pure .NET-managed code architecture enabled via the use of Web services and Active Server Page (ASP).NET forms. The approach used here has provided clients with a true zero-footprint client for deployment, where no component other than a browser is required on the client workstation.

The benefits of the approach used in the VISIBILITY.net application are multiple, including a significant reduction in the amount of code required to deliver the more than 1,000 new distinct functions; a reported threefold to fourfold increase in transaction performance and associated scalability; and a reduction in the cost of deployment and management, as the application can be run by any client capable of running Microsoft Internet Explorer (IE) v5.5 SP2 or later as its browser. By abstracting the application model to make use of managed code and Web services, which deploy the form, business logic, and data connection layers separately, Visibility has reportedly gained database independence, improved run-time performance, and application extensibility in relation to other applications that make use of a well-formed SOA.

Example Three: Epicor for Service Enterprises

The last example of a Microsoft-only stack product containing pure .NET-managed code and "militantly" componentized Web services is Epicor for Service Enterprises, a brand new enterprise service automation (ESA) solution. This product aims to provide a single source for managing and automating most aspects of the project-focused organization. The product is written completely in .NET-managed code and targets the latest Microsoft stack: Microsoft .NET Framework 2.0, Microsoft SQL Server 2005, VS.NET 2005, and Web services. To be precise, the latest version (8.1.1), which became generally available just a few weeks ago, runs on SQL Server 2005, .NET Framework 1.1, and VS.NET 2003; certification for the move to .NET 2.0 and VS.NET 2005 is in progress, and is expected to become available in the next few months, along with Microsoft Project 2007 support, as part of Epicor's commitment to support the latest Microsoft stack at all times. In any case, this application did take several years to write from scratch (the initial release was in June 2003, and it currently has more than 70 customers and 25,000 seats) and, contrary to its brethren within Epicor, is limited to Microsoft technology alone because of this approach, but it also has the benefits of .NET as mentioned above.

Furthermore, the product is backed up by the Epicor Internet Component Environment (ICE), which is a standards-based framework written with Microsoft VS.NET and running on top of the Microsoft .NET Framework. It offers an application development environment (customization and extensibility tools for assembly, deployment, execution, and maintenance of applications) with a feature-rich (albeit thin client) user interface (UI), and pure web access to clients. Using Web services for nearly all application logic, Epicor ICE provides a detachable and vastly configurable UI that is simple to deploy and easy to maintain.

Examples of Microsoft .NET Enablement

The Microsoft .NET environment includes what a business might need to develop and deploy a Web service-connected information technology (IT) architecture: smart clients, servers to host Web services, development tools to create them, applications to use them, and a worldwide network of more than 35,000 Microsoft Certified Partner organizations to provide any help users might need.

Part Two of the series Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement.

Most vendors have naturally chosen to evolve their existing application framework to meet the market needs detailed in Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement. For a general discussion of the evolution of system architecture, see Architecture Evolution: Service-oriented Architecture versus Web Services.

When a compelling new technology does appear, it is quite common in the industry for an enterprise application provider to surround its old enterprise resource planning (ERP) or accounting core software in a "wrapper" of newer technology. The purpose of this is to effectively obfuscate the old technology, giving it the latest graphical look, or providing an easier means to access the core business logic and data from other, more modern systems and devices, or the Internet. Many ERP and accounting back-office systems in the market today were originally written in non-mainstream, or even antiquated, technologies, and still contain cores built on them. Strategies employed to wrap older products include putting contemporary Windows graphical user interfaces (GUIs) (often referred to as "screen scrapers") or web browser-based user interfaces (UIs) on them. Lately, strategies have included providing new Web services layers to rejuvenate aged products by accessing the old business logic components and databases.

Evolving means a slower process, where incremental changes are made to the existing architecture so that it eventually meets these demands. There are some good examples of .NET-enabled legacy software systems to which wrappers have been added to allow legacy functionality to be used and extended through Web services on the .NET Framework. In other words, at this more advanced level of .NET readiness, the legacy software system has a wrapper added, which is a communication component created by an additional layer of code in the product. The wrapper is written in one of the .NET Framework languages, and by adding this wrapper, the legacy system functionality can be used through Web services. Other advantages of this approach are that such systems run on the accepted current market definition of the .NET Framework, and allow fairly rapid enablement of legacy functionality.
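The wrapper pattern itself is language-agnostic. The sketch below (in Python, with an invented fixed-width "screen" format standing in for the legacy interface) shows the essential shape: the old core is left untouched, and a thin layer of new code translates between modern service-style calls and the legacy entry point:

```python
class LegacyOrderCore:
    """Stand-in for an aged, screen-oriented core (COBOL, RPG, etc.)."""

    def enter_order_screen(self, fields: str) -> str:
        # The old core expects fixed-width "screen" input and
        # answers with a terse status string.
        customer, qty = fields[:8].strip(), int(fields[8:12])
        return "OK  " if qty > 0 else "ERR "

class OrderServiceWrapper:
    """The wrapper: an additional layer of new code that exposes the
    unchanged legacy logic through a modern, service-like call."""

    def __init__(self, core: LegacyOrderCore):
        self._core = core

    def submit_order(self, customer: str, quantity: int) -> bool:
        # Translate the modern call into the legacy screen format.
        screen = f"{customer:<8}{quantity:>4}"
        return self._core.enter_order_screen(screen).strip() == "OK"
```

In a .NET-enabled product the wrapper would be written in a .NET Framework language and published as a Web service, but the division of labor is the same: rapid enablement, with all business rules still living in the old core.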

One application that fits into the wrappering side of things is the Epicor Enterprise client/server product suite, whose business logic is exposed via .NET Web services. Most Epicor products generate Web services from business logic, or have business logic that simply is Web services (since the vendor has some heritage manufacturing products that will remain in traditional client/server mode). It is also interesting to note that there are no Web services in many of the Microsoft Dynamics ERP products as of yet. This is primarily because in some products, nothing is coded in (or requires) .NET for now, at least not in the core product (see Microsoft Keeps on Rounding up Its Business Solutions). Certainly, developers have an application program interface (API) to code in .NET around them or to extend them, but the core product is still largely Common Object Model (COM)-based. Even Epicor Enterprise might be in "better shape," since Epicor provides .NET-enablement via .NET extensible markup language (XML) Web service code wrappers.

A great example of a more advanced approach is that of SYSPRO. It goes beyond using wrappers, and aims to componentize the product so that its functionality can be used on any device or with any modern development language (including .NET languages). SYSPRO is a well-known developer of enterprise software for mid-market manufacturers and distributors (with about 12,000 licensed companies in more than 60 countries worldwide), and was one of the first software vendors to embrace the Microsoft .NET technology (see SYSPRO—Awaiting Positive IMPACT From Its Brand Unification). SYSPRO spent years developing its .NET Framework-based solution during the same time period as Microsoft's efforts to launch the .NET Framework technology commercially. Many of the building blocks of the SYSPRO solution were built initially on the beta releases of the commercially available Microsoft software. The company saw the .NET Framework as a way to add functionality and extend controls along the entire supply chain, without the need for extensive programming or alterations to the core system.

SYSPRO introduced SYSPRO e.net solutions to expose the extensive SYSPRO functionality as business objects that can be used on any device or with any modern development language. This "componentization" was written from the ground up, to work seamlessly with XML, and .NET or COM environments. The SYSPRO business objects or components are "building blocks" that allow customers and developers to build Web services for customized solutions, or for seamless integration into third party products relatively quickly and easily. The business objects ensure that business logic, SYSPRO security, and data integrity are retained.

SYSPRO e.net solutions also form the foundation of the rewritten SYSPRO web applications that were developed using .NET technology and the SYSPRO business objects. The SYSPRO e.net solutions Web services, which form part of the web applications, deliver the core SYSPRO functionality to almost any client device across a variety of protocols in an integrated and cohesive manner. This enables applications and services that should provide manufacturers and distributors with new levels of functionality and flexibility.

A good example of the use of SYSPRO e.net solutions is the SYSPRO CyberStore offering, which is an e-commerce application that extends the concept of service-oriented architecture (SOA) to business-to-business (B2B) and business-to-consumer (B2C) trading. SYSPRO CyberStore offers online shopping 24x7 with near real-time inventory information and real-time pricing, and places the order directly into the SYSPRO ERP system using SYSPRO business objects and XML standards. For example, as a user navigates through the e-commerce site, different SYSPRO e.net solutions objects and services will be invoked to retrieve the relevant information. As a product is selected by a buyer, the inventory look-up business object is invoked to perform an online inventory check and fetch the latest image of the product from the back-end SYSPRO ERP system. The information is returned and rendered to the user, with the result being live inventory information provided to a potential e-commerce buyer. In addition, if the refresh button is selected a split second after a sales order is entered by the accounts department in the back-end ERP system, the revised inventory information will be displayed to the user on the e-commerce site. If the user purchases the item, the information relevant to the buyer and the payment method will be collected in the front-end e-commerce system, passed to a business object using the XML standard, and automatically processed in real time into the back-end SYSPRO ERP system.

SYSPRO e.net solutions provide a fairly cost-effective way for SYSPRO customers to integrate other best-of-breed applications, maximize B2B e-commerce trading, and leverage wireless connectivity, without compromising the business rules and security inherent in SYSPRO software. The XML standard and collaborative commerce tools like the Microsoft BizTalk Server and SYSPRO e.net solutions Document Flow Manager (DFM) enable systems to be more extensible and to collaborate with any other disparate or legacy system. As a result of the effective use of the .NET Framework, objects and services, and XML, independent systems can be set up to collaborate in real time despite their disparities.

In order to make the technology viable in the mid-market, SYSPRO embedded the aforementioned collaborative commerce engine into the core SYSPRO ERP software. The DFM automatically consumes and transmits XML transactions in real time by continually checking predetermined folders or e-mail addresses on a Microsoft Exchange Server for XML transactions. The XML transaction files are either e-mailed or transmitted via file transfer protocol (FTP). As the SYSPRO predefined XML transaction is identified by the DFM, it is automatically consumed by the module, which will in turn invoke a business object (business logic) to process the received transaction. The DFM module can also be configured to transmit the reply from the business object to an e-mail address or to another business object for further processing.
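The consume-and-dispatch step of such a document flow manager can be sketched as a single polling pass. This is a hypothetical Python illustration, not SYSPRO's actual DFM; a real module would also watch e-mail and FTP drops, loop continuously, and route replies onward:

```python
import os
import xml.etree.ElementTree as ET

def process_inbound_folder(folder):
    """One polling pass: consume each XML transaction file in the folder,
    invoke the matching business object, and collect the replies."""
    replies = []
    for name in sorted(os.listdir(folder)):
        if not name.endswith(".xml"):
            continue
        path = os.path.join(folder, name)
        doc = ET.parse(path).getroot()
        # Dispatch on the root element, as if looking up a business object.
        if doc.tag == "addSalesOrder":
            replies.append(f"order {doc.findtext('orderNumber')} accepted")
        else:
            replies.append(f"unknown transaction {doc.tag}")
        os.remove(path)  # the transaction has been consumed
    return replies
```

The key property is that the folder acts as a durable queue: transactions dropped there while the back-end is busy or offline simply wait for the next pass.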

In environments where it may not be feasible to transact in real time directly through a Web service, the DFM module can be used to asynchronously process the transaction. The payment authorization can be processed in the front-end e-commerce system, and the relevant information placed in an XML file, which would then be transmitted back to the DFM and consumed using the same business object as if it were being processed using the Web service. In a situation where the back-end SYSPRO ERP system is offline for some reason, transactions are queued for processing by the DFM, and the module can initiate replies and transmit XML transactions back to the e-commerce system, and initiate e-mails or other proactive processes to increase efficiency and improve the customer experience.

SYSPRO Expands Strategy

SYSPRO continues to expand its .NET Framework strategy with its latest product release, SYSPRO Version 6.0 Issue 010. The brand new SYSPRO Reporting Services (SRS) suite is written using .NET technology, and uses the business objects to render the reports seamlessly to an embedded Crystal Reports XI Server. SRS offers additional functionality such as archiving, scheduling, report customization, and various output methods. Issue 010 also sees the release of a new SYSPRO Analytics module (totally rewritten in .NET) which provides a solution for analyzing and dissecting data, enabling businesses to track trends, recognize and adapt to changes, and make informed decisions.

SYSPRO delivers information to users with a sophisticated analytics tool that is fairly easy to use, and does not require the technical knowledge of an online analytical processing (OLAP) developer. The new UI in Issue 010 has also been rewritten using cutting-edge GUI components, offering SYSPRO users easy personalization of their own screens, as well as easy customization that ties Web services and VBScript to an unlimited number of user-defined fields. The customization can be deployed on a central or distributed basis. Electronic Signatures, also released in Issue 010, enables much more than just operator authentication when transactions are processed. The flexible design enables processes to be triggered based on user-defined criteria, facilitating enhanced business process controls that link to third party or custom programs, or to Web services with VBScript.

SYSPRO e.net solutions continue to provide a solid foundation that enables businesses to build or develop a service-oriented architecture (SOA). SOA concepts can simplify the reengineering of business processes, and provide a solid and obvious foundation for companies to respond more quickly to change. Business benefits from SOA should also include improved time-to-market, more responsive customer service, and increased visibility in the face of changing regulations, such as the US Sarbanes-Oxley Act (SOX) (see Using Business Intelligence Infrastructure to Ensure Compliancy with the Sarbanes-Oxley Act) and US Food and Drug Administration (FDA) requirements.

Epicor Software Corporation, also a prominent mid-market provider of industry-specific enterprise software solutions, is an example of a vendor that used Microsoft .NET technology to rebuild its application from the ground up as an SOA too. Epicor delivered its next-generation SOA-based manufacturing solutions, Vantage 8.0 and Vista 8.0, at the end of 2004. The solutions continue to be adopted by progressive midsized manufacturers, with the re-architected Vantage and Vista solutions having reportedly been shipped to more than 1,000 new and existing Epicor customers. Epicor Vantage is a comprehensive solution designed to meet the needs of make-to-order (MTO) and mixed-mode manufacturing companies, while Epicor Vista is an integrated manufacturing and accounting solution for emerging manufacturers, job shops, or MTO departments of large enterprises. Designed to meet the needs of both small and medium manufacturers, the solutions include built-in workflows managing the entire order cycle, from marketing and sales, through production and planning, sourcing and procurement, installation and service, and financial recognition. Early in 2002, Epicor embarked on a major effort to rebuild the two products using Microsoft .NET and the Progress OpenEdge fourth-generation language (4GL) development environment, which has meanwhile become .NET Framework compliant (for more information, see Epicor Reaches Better Vista From This Vantage Point). Both Vantage and Vista are available on Microsoft SQL Server and Progress databases, giving customers the freedom and flexibility to choose whichever makes sense for their business.

As a result of a major effort over a few years (about one year to develop the SOA tool set, and the rest to port the code), the new object-oriented SOA-based ERP systems were announced in late 2004. Featuring an n-tier architecture built with Microsoft .NET and Web services technology, both Vista 8.0 and Vantage 8.0 are architected from the ground up to support SOA, which should enable user businesses to leverage software services and components through open industry standards. In turn, this should simplify application-to-application (A2A) integration and supply chain connectivity. The new architecture exposes all functionality as Web services, which should make it easier to orchestrate business processes and workflows within the application in order to promote lean principles and continuous performance initiatives. It also promises new levels of application reliability, scalability, system interoperability, and flexibility, combined with a rich and personalized user experience.

Today, nearly 500 business objects across 30 modules (featuring thousands of business functions) are exposed as business services, meaning that customers should be able to extend the applications and develop integrations to other products. To that end, given the somewhat heterogeneous portfolio of Epicor products, the vendor announced Epicor Service Connect (released in fall 2005), a Web services-based business integration platform. This platform functions as the central integration point for implementing secure workflow orchestrations within Epicor applications and with third party applications to enhance collaboration and automate business processes. Harnessing the openness of XML Web services, Service Connect uses industry-wide standards and technology enabling businesses to deploy solutions now, with a degree of confidence that their investment will remain intact in the future. The SOA of many Epicor solutions enables Service Connect to transform or combine application processes to streamline processing within the application framework, whereby business components, represented as Web services outside of the application, can easily be accessed within Service Connect to eliminate non-value added steps and streamline basically any business process.

Designed to support both internal and external connectivity to Epicor solutions as well as to external applications or processes, Service Connect provides a straightforward solution for graphically mapping process flows and orchestrating transactions. To that end, the tool uses visual workflows to map data to different formats, and create and assign tasks for human interaction, and uses "drag and drop" processes to call Web services, enabling non-programmers to build their own scenarios that interoperate with the application. For example, processing sales orders typically involves multiple steps including numerous availability inquiries, reviews, and inventory release decisions, all serving to extend order lead times. Service Connect enables users to eliminate many of these steps by creating orchestrations for routing processes to automated tasks, such as order-submit direct-to-pick for specific inventory items, or priority order fulfillment for a company's best customers, in turn improving order-to-delivery performance. By enabling non-programmer solution users to automate tasks and processes within the application, Service Connect also helps to promote lean principles, continuous performance initiatives, and Six Sigma quality, by providing a simple workflow orchestration tool to improve collaboration, velocity of information, and ultimately value chain performance.
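The "eliminate non-value-added steps" idea behind such orchestrations can be sketched as a routing decision over a chain of service calls. This is a hypothetical Python illustration of the concept, not Service Connect's actual API; the step names and rules are invented:

```python
def availability_check(order):
    """Stand-in for an availability-inquiry service call."""
    order["available"] = order["qty"] <= 100
    return order

def manual_review(order):
    """Stand-in for a human-interaction task."""
    order["reviewed"] = True
    return order

def release_to_pick(order):
    """Stand-in for the inventory-release service call."""
    order["status"] = "picking" if order["available"] else "backorder"
    return order

def orchestrate(order, priority_customers):
    """Route the order through a chain of service calls, dropping the
    manual-review step entirely for priority customers."""
    if order["customer"] in priority_customers:
        steps = [availability_check, release_to_pick]          # direct-to-pick
    else:
        steps = [availability_check, manual_review, release_to_pick]
    for step in steps:
        order = step(order)
    return order
```

The orchestration layer only decides which services run and in what order; the services themselves are unchanged, which is what lets non-programmers rearrange the flow safely.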

Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement

Microsoft .NET is a comprehensive software development environment that was introduced in 2000 as Microsoft's next-generation programming environment. Pronounced "dot net," and widely known as the Microsoft .NET Framework, it was designed to compete with the counterpart Java-based Java 2 Enterprise Edition (J2EE) platform. The .NET Framework is the Microsoft Web services strategy for connecting information, people, systems, and devices through software, with the promise of "information anytime, anywhere, on any device." Integrated across the Microsoft platform, .NET Framework-based technology provides the ability to more quickly build, deploy, manage, and use connected, security-enhanced solutions with Web services.

Part One of the series Subtle (or Not-so-subtle) Nuances of Microsoft .NET Enablement.

The Microsoft .NET environment includes what a business might need to develop and deploy a Web service-connected information technology (IT) architecture: smart clients, servers to host Web services, development tools to create them, applications to use them, and a worldwide network of more than 35,000 Microsoft Certified Partner organizations to provide any help users might need.

The Microsoft .NET Framework is an integral Microsoft Windows component for building and running the "next generation" of applications and extensible markup language (XML)-based Web services. Among the potential benefits of the .NET Framework-based technology is the ability to provide a productive, standards-based, industrial strength, enterprise-ready, multilanguage environment that simplifies application development. This should enable developers to make use of their existing skill sets, facilitate integration with existing software, and ease the challenges of deploying and operating Internet-scale applications. The .NET Framework is the infrastructure of the .NET platform, which includes the Common Language Runtime (CLR) and the .NET Framework class library. The CLR provides the environment for running .NET Framework-based applications, whereas the class library provides the foundation services, including Active Server Page (ASP).NET; ActiveX Data Objects (ADO).NET; WinForms (for building graphical user interfaces [GUIs]); and base class libraries for accessing Common Object Model (COM) services.

Programmers can choose from several different programming languages, such as Microsoft C# (C Sharp), Visual Basic .NET (VB.NET), J# (J Sharp), Managed C++, JScript.NET, and others. The European Computer Manufacturers Association (ECMA) has standardized .NET as the Common Language Infrastructure (CLI), and numerous other languages have been reengineered as CLI languages. ECMA also standardized the C# programming language, designed by Microsoft to be the flagship .NET Framework-based language.

Depending on the class libraries used, the output of .NET and CLI compilers may or may not be identical, since .NET compilers generate Microsoft Intermediate Language (MSIL) bytecode, and CLI compilers generate Common Intermediate Language (CIL) bytecode. MSIL is executed by the CLR, and CIL bytecode is executed by the Virtual Execution System (VES). Both the CLR and VES are run-time engines, like the Java Virtual Machine (JVM) in Java, since they provide a fundamental set of services that all programs use. The difference is that the JVM was designed around a single source language, Java, whereas the CLR and VES were designed from the start to host multiple programming languages.

As mentioned earlier on, the heart of both .NET and CLI is a cross-platform language system. Although similar to Java because it uses an intermediate bytecode language that can be executed on any hardware platform that has a run-time engine, it is also unlike Java, as it provides support for multiple programming languages.
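The intermediate-bytecode idea is easiest to see with a toy example: a tiny stack machine (sketched here in Python) playing the role the CLR or VES plays for MSIL/CIL, or the JVM for Java bytecode. Any number of front-end languages could compile down to the same instruction list:

```python
def run(bytecode, *args):
    """A toy run-time engine: execute a list of (opcode, operand) pairs
    on a stack, the way the CLR executes MSIL or the VES executes CIL,
    reduced here to just three opcodes."""
    stack = list(args)
    for op, arg in bytecode:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# The expression (x + 3) * 2, as it might be emitted by any front-end
# language targeting this intermediate form:
program = [("push", 3), ("add", None), ("push", 2), ("mul", None)]
print(run(program, 5))  # 16
```

Because the engine only understands the intermediate form, the hardware platform and the source language are both abstracted away, which is precisely the cross-platform, multi-language claim made for .NET and the CLI.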

Currently in a beta release, the Microsoft .NET Framework 3.0 (formerly called WinFX) includes a new set of managed code application program interfaces (APIs) that are an integral part of the upcoming Windows Vista and Windows Server "Longhorn" operating systems. It will also be available for Windows XP SP2 and Windows Server 2003. The Microsoft .NET Framework 3.0 includes version 2.0 of the CLR, and it consists of four major components:

1. Windows Presentation Foundation (WPF) (formerly code-named Avalon), a new user interface (UI) subsystem which is API-based on XML and vector graphics (it will make use of three-dimensional [3D] computer graphics hardware and Direct3D technologies);
2. Windows Communication Foundation (WCF) (formerly code-named Indigo), a service-oriented messaging system that allows programs to interoperate locally or remotely similar to Web services;
3. Windows Workflow Foundation (WF), which allows for building of task automation and integrated transactions using workflows; and
4. Windows CardSpace (WCS) (formerly code-named InfoCard), a software component that securely stores digital identities of a person, and provides a unified interface for choosing the identity for a particular transaction, such as logging in to a web site.


While more technical details on Microsoft's ever-morphing technology blueprint can be seen in What Do Users Want and Need?, a key aim of .NET is interoperability between systems, both internal and external. The framework uses Web services and componentized systems as building blocks to create more collaborative systems. A resulting enterprise system is componentized by creating business objects that can be independently accessed to perform specific business functions and processes. The .NET Framework uses the XML standard as its "glue" for transferring data between objects, and in and out of the core system, and it embraces the concept of a browser and Web services as a means of rendering the information to a user. The business object should act as a "gatekeeper" to the system by ensuring that the following three fundamentals remain intact:

1. Security is enforced by the mere fact that every time an object is accessed, the user is authenticated, and the security level prescribed by the core application is adhered to.
2. The business logic of the underlying application is always protected, whereby parameters are simply passed to the object for processing. The object protects the underlying business logic, and processes the transaction based on the passed parameters as if a user was sitting at a client workstation and entering the transaction.
3. The underlying data integrity is always protected, as the raw data is never accessed, since all data manipulation is controlled by the protected business object. The integrity of the underlying system is kept intact at all times, while at the same time an environment is created to extend functionality with a minimum amount of time, cost, and expertise.
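The gatekeeper pattern described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class, service, and method names are invented for the example, not part of any Microsoft API): every call re-authenticates the user, parameters pass through the protected business logic, and raw data is reachable only through the object.

```python
# Hypothetical sketch of the "gatekeeper" business object pattern.
# All names here are invented for illustration.

class OrderEntryObject:
    """Business object guarding the core system's order-entry logic."""

    def __init__(self, auth_service, order_store):
        self._auth = auth_service    # enforces the core application's security
        self._store = order_store    # raw data is reachable only through here

    def submit_order(self, credentials, params):
        # 1. Security: every access authenticates the user and checks the
        #    security level prescribed by the core application.
        user = self._auth.authenticate(credentials)
        if not self._auth.allows(user, "order.create"):
            raise PermissionError("security level insufficient for order entry")

        # 2. Business logic: parameters are simply passed in and validated,
        #    as if a user entered the transaction at a client workstation.
        if params.get("quantity", 0) <= 0:
            raise ValueError("quantity must be positive")

        # 3. Data integrity: all data manipulation is controlled by the
        #    object, so the underlying tables are never accessed directly.
        return self._store.insert_order(user=user, **params)
```

An external application extending the core system would call `submit_order` rather than writing to the database, which is what keeps upgrades of the core from breaking the extension.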

Thus, with the advent of .NET, Microsoft-centric users might have the "best of both worlds," as they can benefit immensely from a feature-rich core system and have the added advantage of being able to develop business-specific applications to extend the functionality around the core system. This comes without the concern that future upgrades of the core system might affect or break the business-specific application.

Yet the Microsoft .NET strategy continues to confuse many users and vendors, due to the lack of understanding surrounding the technology. Indeed, because of the massive marketing campaign undertaken by Microsoft on the benefits of its .NET Framework-based technology, many vendors have adopted a "too liberal" approach to marketing their .NET Framework-based initiatives. The fact is, as soon as a software product is enhanced to consume or emit XML, it is called a .NET Framework-based product. In an effort to have their offerings perceived as ".NET-enabled," numerous vendors are referring to their solutions as such, though their products fall short of fulfilling many of the Microsoft-defined .NET parameters, some of which were outlined earlier on.

Consider the case where the body of software code comprising the core of the enterprise system has already been written. This code encompasses the business logic—the vast collection of rules that define required business transactions, and the rules and conventions for ensuring data accuracy, integrity and completeness, and appropriateness. The vendor is naturally reluctant to rewrite that core (which was difficult to write and maintain in the first place) in a new language, or to make the major structural changes necessary to employ a newer, more powerful database or operating system (OS) platform technology. Analyzing the current state of affairs of .NET readiness amongst the independent software vendors (ISVs), the most basic categorization (but not necessarily the most prevalent) is the case of mere .NET compatibility. This means that legacy software simply runs on .NET-branded servers (Microsoft Windows). On a positive note, these Microsoft-centric vendors can run on the latest Microsoft OS and database platforms. But on the downside, the well-publicized benefits of Web services are possibly not easily achievable, although these are often the first things companies want to integrate.

Sizing the Enterprise Incentive Management Opportunity—And the Challenges Ahead

It's no wonder that in mid-2006, the results of an enterprise incentive management (EIM) market sizing study conducted by Evalueserve showed that the overall worldwide market for EIM solutions was estimated at $3 billion (USD) in 2005, although one has to note that this market mainly consisted of homegrown solutions (91 percent). The global third party packaged software market for EIM solutions is expected to grow at a compound annual growth rate (CAGR) of 30 percent, from $275 million (USD) in 2005 to $1 billion (USD) by 2010. The research suggests that pure-play EIM vendors which have focused on providing the capability to manage highly complex compensation systems will be well positioned to take advantage of the major growth projected in the EIM market. Indeed, prospects for this market outstrip those of other enterprise applications markets, such as customer relationship management (CRM) and enterprise resource planning (ERP).

Part Five of the series Thou Shalt Motivate and Reward Workforce Better.

For background information on incentives and compensation, see Thou Shalt Motivate and Reward Workforce Better, Are Sales Incentives Even in Tune with the Corporate Strategy?, What Makes Incentives and Compensation So Tricky?, and Enter Enterprise Incentive Management and Incentive Compensation Management.

Evalueserve has examined EIM market growth rates by geography and industry, as well as industry fit. Highlights from the research of industry-specific projections show impressive growth rates and opportunities for leading industries using EIM solutions, but also show the need for EIM providers to show industry savvy. The industries examined in this note are insurance, retail banking, retail, high tech, telecommunications, and life sciences.

EIM for Insurance, Retail Banking, and Retail

There is a 34 percent growth prediction in the insurance industry, where EIM solutions are primarily driven by bonus and commission payouts, but there is also a projected increase of insurance sales agents through 2014 that should boost EIM expenditures. Distribution channel management has become a key differentiator for insurance companies in the current difficult investment and claims environment.

Namely, the insurance industry has become increasingly focused on brokers and alternative distribution channels over the past twenty years, leading to complex and often convoluted distribution chain relationships. Insurers require channel flexibility and support for an ever-widening portfolio of products to meet broker and consumer expectations, while concurrently demanding process efficiency, data accuracy and transparency, and information technology (IT) cost savings. Recently, insurer-broker relationships have come under close regulatory scrutiny, requiring insurance companies to provide detailed information about broker behavior and compensation. Unfortunately, most insurance companies are unable to show a consistent and consolidated view of the insurer-broker relationship, due to the multiple roles brokers play in the insurance industry, and the legacy technology used to manage them.

As regulators continue to examine the insurer-broker relationship, consolidated distribution information transparency becomes critically important, while the nature of the insurer-broker relationship will likely evolve rapidly over the next several years. Insurance companies are thus expected to implement enterprise-level processes and technology to manage distribution information and incentives, thereby ensuring proper compliance with existing and new industry regulations. They seem to have several main business challenges:

* managing a shifting portfolio of traditional and alternative distribution channels covering a variety of markets;
* driving cross-selling initiatives within the existing customer base through both existing channels and multi-channel team structures;
* satisfying an increasingly stringent financial regulatory environment through accurate payments, transparent information, and clear, auditable processes; and
* reducing the operational and IT costs associated with distribution management, and reducing the risks associated with outdated, inaccurate and inflexible systems.

Consequently, an EIM package for insurance must be designed to meet the needs of today's insurance distribution management, supporting complex distribution channel hierarchies and the expansion of existing distribution channels, and more roles within each channel (meaning brokers and consultants). Such a package must also support flexible, effective-dated compensation plans; detailed information about all incentive and fee-based payments; and multi-tiered compensation plans, including commission, incentives and management and wholesaler overrides. It must help distribution management respond to changing channel and market demands quickly and efficiently through user-configurable compensation plans and web-enabled, secure compensation reports, while remaining compatible with both current and legacy insurance architectures, and enabling easy integration with multiple policy administration systems, as well as downstream financial systems.

This combination of controlled processes (via auditable, accurate payment of incentives in compliance with established corporate guidelines); flexible modeling and implementation of plan changes (new incentive plans to meet the needs of a changing regulatory environment); and distribution information transparency should help insurance companies meet the challenges of a rapidly changing insurance marketplace.

Somewhat related to insurance (under the financial institutions segment), retail banking is widely regarded as the growth engine for the banking industry today, whereby branch offices are becoming more valuable as a prime face-to-face selling environment. In order to capitalize on this opportunity to maximize customer share of wallet (SOW), banks are paying incentive compensation to branch managers, tellers, and other customer-facing employees. In addition, mergers and acquisitions (M&As) offer their own challenges—as banks merge and offer a wider array of products and services to clients, the sales structure becomes more difficult to track, audit, and analyze. The pace of M&As within banking, and across all financial services businesses, demands a more focused view of sales strategies and their impact on corporate goals. Thus, a banking EIM package must help manage incentive compensation for branch and call center employees, successfully motivating them to raise the value of customer interactions. It must also integrate easily and quickly with other business process systems, smoothing the way for effective compensation plan management in a complex multi-product, multi-channel sales environment.

Evalueserve also reported a 26 percent expected growth in the retail industry for EIM solutions, owing to increasingly complex supply chains, a large number of transactions, and the growing number of retail salesperson jobs. Lately, big-box retailers such as Wal-Mart and Tesco have dramatically changed the business environment for all retail businesses. The ability of major chains to cut costs and improve distribution allows them to dominate multiple retail segments, making them a powerful competitive juggernaut that other retailers must contend with in the marketplace. In tandem with competitive pressures, retailers are also grappling with increasing rates of employee turnover—as high as 87 percent in some segments. At the same time, increased pressure to comply with labor regulations is complicating store operations for many retailers.

Retailers' common business challenges are to align store operations to achieve bottom-line results and improve store productivity, while concurrently focusing on customer experience and increasing transaction amounts. To that end, a retail-oriented EIM package has to help retail executives analyze real-time sales performance, and change compensation strategy in order to react quickly to changing market conditions. It should also automate many compensation management processes that drain away staff resources, allowing retail employees to focus more on selling, and less on administrative tasks. Forecasting tools also help sales executives perform what-if scenarios for labor dollars across the company.

The anticipated 25 percent growth in the high tech industry is supported by the implementation of EIM solutions in many companies, due to complex distribution structures and compensation plans, and growing sales forces. Thin profit margins, tough price competition, short product life cycles, and the need to optimize inventory require manufacturing companies to keep a close watch on ever-changing sales strategies and compensation expenses. In addition, frequent new-product deliveries and introductions create the need for fast-changing, short-term sales incentives, whereby the challenge is to carefully manage discount practices. The appropriate EIM solution has to take the guesswork out of launching compensation programs by allowing managers to model new plans before rollout, while sales and compensation staff should be able to see at a glance how well sales strategies are working via alerting and portal applications. The solution also has to be able to associate incentive pay to specific discount practices, product mix, or other profitability measures to help maximize sales performance.

A projection of 11 percent growth in the telecommunications industry results from increased sales forces, retail outlets, and a variety of tariff plans complicating incentive payment management. As telecommunications companies have increased the number of products and services they offer to the market, competition has intensified, resulting in lower prices for the consumer. Given these declining prices for products and services, telecommunications companies must increase their total number of customers to drive revenue and profit growth. The value of each customer has risen, indicating that customer retention (reduced customer churn) is imperative. Sales teams and retail outlets are compensated not only for signing up new customers, but for signing customers who keep their services activated for at least six months or longer. In addition, sales representatives and distributors are rewarded for up-selling and cross-selling additional products and services to customers. Other notable business challenges include improving customer service capabilities to bolster customer satisfaction, developing new products and services and business models to protect against competitive threat from new technologies (such as voice over Internet protocol [VoIP]), and implementing strategic partnership models to expand distribution capabilities.

An EIM solution for telecommunications should thus provide the flexibility such companies require to keep compensation plans in line with continually changing, competitive market conditions. Using the process model, telecommunications providers must be able to quickly and easily implement compensation plans with multiple performance measures to address business issues such as customer retention and product mix. Stepping-stair matrices can be used to reward sales teams for selling product bundles, where each additional product included results in a higher commission rate. The alerting and reporting applications should enable providers to share performance and compensation payment data with sales teams and retailers, securely and frequently via the web, thereby motivating the sales force and distribution channels to reduce service deactivations, and to up-sell additional products and services.
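The stepping-stair idea can be made concrete with a small sketch. The rates and table below are invented for illustration: each additional product in the bundle moves the sale up one "stair" to a higher commission rate.

```python
# Hypothetical stepping-stair commission matrix: the more products bundled
# into a single sale, the higher the commission rate applied to it.
# All rates are invented for illustration.

BUNDLE_RATES = {1: 0.02, 2: 0.03, 3: 0.045, 4: 0.06}  # bundle size -> rate

def bundle_commission(sale_amount, products_in_bundle):
    if products_in_bundle < 1:
        return 0.0
    # Bundles larger than the matrix covers earn the top rate.
    rate = BUNDLE_RATES[min(products_in_bundle, max(BUNDLE_RATES))]
    return sale_amount * rate
```

A $1,000 two-product bundle would earn $30 under these sample rates, versus $20 for two separate single-product sales, which is exactly the behavior the matrix is meant to encourage.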

Last but not least are the life sciences industries, with great EIM growth projections of about 30 percent. Life sciences sales representatives are offering more products to their sales targets, but are spending less time with each physician. Sales representatives must continually sharpen their go-to-market strategies and product pitches to stay ahead of the competition, whereas sales teams have to deepen their knowledge of each medication or medical device, while learning how to communicate the benefits more crisply. In the future, life sciences companies may better use longitudinal data, such as patient demographics, in addition to traditional EIM data when devising sales and marketing plans. Incentive compensation programs are critical to the sales efforts in this industry, since rewards and contests are commonly used to supercharge a new product introduction and to drive competitive wins and market share gains. Territory management is vital to ensure the proper coverage models for categories of medications and groupings of physicians. Often, as many as seven different sales representatives from one company call on the same doctor, which makes managing territorial splits and account reassignments crucial. Another business challenge stems from the need to maintain an auditable record of drug sales to comply with government regulations.

With detailed visibility into business activities and sales performance, an EIM package should allow life sciences companies to create and implement compensation plans that meet their unique corporate goals. Sales and compensation staff should thereby also be able to drill down into performance for specific plans and sales sectors, customizing and refining strategies for precise target markets, while an automated rule-based system should allow compensation staff to easily manage programs for many product lines across complex territories, meeting the demands of fluid sales structures.


What Makes Incentives and Compensation So Tricky?

Managing incentive compensation presents challenges to almost every large and midsized company, due to the complex nature of the calculations. Such calculations might involve determining whether the sales plan is to pay on profit, or rather pay on market share; whether it involves multiple payees per transaction, etc. Other considerations include the high levels of security required (owing to the numerous and diverse authorization levels, and the fact that data feeds are coming from many external data sources), and the dynamic nature of the sales environment that imposes the need for visibility of millions of report pages per week containing all pertinent sales, customer, and corporate data. Further complicating these considerations are intensive integration of divisions, legacy systems, upstream and downstream connections, and so on. Last but not least, there are also people issues to manage, such as disputes, approvals, reorganizations, and self-service capabilities.

Part Three of the series Thou Shalt Motivate and Reward Workforce Better.

Going deeper into these issues, incentives can be defined in very complex ways, including percentage rate commissions, step rates (whereby incentive rates can be accelerating [an incentive rate that increases after a specific level of performance is attained] or decelerating [an incentive rate that decreases after the attainment of a certain performance level]), and threshold bonuses (the lowest performance level that must be achieved in order for an employee to earn an incentive payment). Other methods include sales promotion incentive funds (SPIFs), which is a loose term referring to an on-the-fly addition to the compensation plan used to motivate the sales force in a particular way by providing additional sales credit or payment for certain types of sales. Also standard are overrides (manual replacements of a value that the system has calculated with another value—for instance, when a person from the bench, for whatever reason, has to come and close a deal for someone who was supposed to do it); draws (cash payment advanced against future income that can be non-recoverable [a guaranteed minimum level of future income] or recoverable [a minimum level of future income that may be recovered from calculated incentive earnings]); draw accounts (usually, an allowance given to sales people working on a straight commission as an advance against commission payments); and many more methods and factors. Incentive compensation can be particularly difficult to calculate for various other reasons:
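To make the threshold and step-rate mechanics concrete, here is a small sketch with hypothetical tiers and rates: nothing is paid below the threshold, a base rate applies up to quota, and an accelerated rate applies beyond it, with each tier's rate paid only on the slice of attainment that falls inside that tier.

```python
# Hypothetical accelerating step-rate plan with a threshold.
# Each tier is (attainment_from, attainment_to, rate); None = open-ended.
# Tiers and rates are invented for illustration.

TIERS = [
    (0.00, 0.50, 0.00),  # below 50% of quota: threshold not reached, no payout
    (0.50, 1.00, 0.05),  # 50% to 100% of quota: base rate
    (1.00, None, 0.08),  # beyond quota: accelerated rate
]

def incentive(sales, quota):
    attainment = sales / quota
    payout = 0.0
    for low, high, rate in TIERS:
        high = attainment if high is None else high
        if attainment > low:
            # Pay this tier's rate only on the slice of quota attainment
            # that falls inside the tier.
            payout += (min(attainment, high) - low) * quota * rate
    return payout
```

For example, $120,000 of sales against a $100,000 quota pays the base rate on the second tier plus the accelerated rate on the 20 percent above quota; $40,000 of sales pays nothing, since the threshold is never reached. A decelerating plan is the same structure with rates that fall, rather than rise, across tiers.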

* Many people can get partial credit for a single transaction; split credits then have to take place when a sale qualifies for inclusion in compensation purposes for one or more employees.
* Territory boundaries can be defined by geography, product, area code, or a myriad of other factors, since territory is a way of defining which transactions a participant should be credited with (it is usually a geographic area, but can also be an industry or a specific set of customers).
* Incentives can also be paid on a rolling basis; at regular intervals (quarterly, monthly, yearly, etc.); or via a customized schedule. In general, an incentive formula represents the mathematical method of ascertaining how pay opportunity relates to performance, and how one pay opportunity relates to another for determining payout.
* Transactions can occur at very high volume.
* Different incentive plans can apply to different job descriptions.
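The first point above, split crediting, can be sketched very simply (the payee names and shares are hypothetical): one qualifying sale is divided among several payees according to fixed fractions.

```python
# Hypothetical split-credit allocation: one sale qualifies for compensation
# credit for several payees, each receiving a fixed share.

def split_credit(amount, shares):
    """shares maps payee -> fraction of the credit; fractions must sum to 1.0."""
    if abs(sum(shares.values()) - 1.0) > 1e-9:
        raise ValueError("credit splits must sum to 100 percent")
    return {payee: amount * share for payee, share in shares.items()}
```

In practice the shares themselves are derived from territory and role rules, which is where most of the real complexity lives.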


To illustrate the complications of shared ownership, in complex sales environments one customer may require several salespeople at different stages in the customer's life cycle, but each salesperson must have a personal reward structure. The compensation system must thus be well aligned with the job. For example, one provider of medical imaging products found that rewarding salespeople and field engineers separately for their specific role in the sales cycle not only improved performance, but also boosted incremental sales. Hence, its sales professionals are now compensated for selling the imaging systems themselves, while its field engineers, deployed remotely, are in the best position to sell extended contracts as customers approach the end of their original warranty period. The field engineers are rewarded separately for customer retention efforts when they sell extended contracts, or cross-sell or up-sell other products. As a result of the compensation strategy and the software to support it, the company has seen an increase in contract renewals, and it can renew contracts faster, eliminating any lapse in coverage.

For background information on incentives and compensation, see Thou Shalt Motivate and Reward Workforce Better, and Are Sales Incentives Even in Tune with the Corporate Strategy?

Solution Requirements

Thus, an astute performance and incentive compensation management solution should solve these calculation challenges by providing a rule-based system that can perform the following functions:

* Divide the compensation calculation process into four logical steps:
1. credit;
2. measure;
3. reward; and
4. deposit.
* Maintain complex territory definitions.
* Assign credit to multiple parties, as defined by the company's sales compensation plans.
* Model and deploy, via a rules engine, the most complex of compensation rules and plans.
* Create multidimensional and multicurrency look-up tables to help maintain control over sophisticated calculations.
* Maintain a highly flexible assignment methodology that ensures each payee is covered by an appropriate compensation plan.
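The four logical steps named above can be sketched as a simple pipeline. This is a deliberately minimal, hypothetical illustration; a real rule-based EIM engine drives each step from configurable rules and territory definitions rather than hard-coded functions.

```python
# Hypothetical sketch of the four logical compensation steps:
# credit -> measure -> reward -> deposit.

def credit(transactions, territory_of):
    """Step 1: credit each transaction to the payee owning its territory."""
    credits = {}
    for txn in transactions:
        payee = territory_of[txn["region"]]
        credits.setdefault(payee, []).append(txn["amount"])
    return credits

def measure(credits):
    """Step 2: roll credited transactions up into a performance measure."""
    return {payee: sum(amounts) for payee, amounts in credits.items()}

def reward(measures, rate):
    """Step 3: apply the plan's rules (a flat rate here) to each measure."""
    return {payee: total * rate for payee, total in measures.items()}

def deposit(rewards, ledger):
    """Step 4: post each earned amount to the payee's account."""
    for payee, amount in rewards.items():
        ledger[payee] = ledger.get(payee, 0.0) + amount
    return ledger
```

Separating the steps this way is what lets a rules engine swap in complex territory definitions, split credits, or tiered reward rules at any one stage without disturbing the others.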


As mentioned earlier, visibility and transparency are key requirements, both to the people being paid, and to the companies that are paying. In addition, new legislation also demands unprecedented transparency for corporate financial records, starting with the US Sarbanes-Oxley Act (SOX) requiring tighter controls and visibility into all financially significant business processes (including incentive compensation). Furthermore, transparency and disclosure of broker commissions, as well as the ability to track whether the channel is meeting the required evolving licensing and educational requirements, is a critical issue in the insurance industry. In general, third party sales representatives require detailed compensation information to maintain loyalty, while shareholders insist on clearly stated returns on investment (ROIs), such as sales performance obtained through incentive compensation. Therefore, the appropriate solution has to provide the visibility that companies need by establishing tight process control over incentive compensation, and then make appropriate information available for review through a reporting application. Such information is used to provide payees with detailed information about their goal attainment and related compensation, as well as to provide managers with the information they need to make decisions about their incentive compensation investments, and visibility into the incentive compensation process to all parties involved.

Dispute resolution challenges arise out of constant changes to territories, organizational changes, miscoding of sales transactions, or other timing issues that cause transactions to be credited to the wrong payee. As mentioned earlier in this series, if not dealt with quickly and effectively, compensation disputes can cause loss of morale among the sales force and distributor network; loss of direct sales, third party brokers, and dealers to competitors; overpayment of compensation to undeserving parties; administrative bottlenecks that draw resources away from more important tasks; and overload on information technology (IT) resources devoted to the research and resolution of disputes. Again, a proper solution has to speed automatic resolution to compensation disputes by leveraging the vast store of knowledge in the compensation repository to research the validity of claims for compensation. It should then intuitively present the results of its research, and in many cases, present a recommended resolution for the dispute. Such a nifty application can not only produce significant savings in overpayments, as well as in countless hours of administrative and IT time, but also improve relations between compensation departments and their sales people, along with third party brokers and channel partners.

Adding to the complexity are tremendously high transaction volumes (with millions of transactions, thousands of payees, and millions of dollars paid per period) across a broad product portfolio, and a diverse sales network of direct sales (field sales, sales managers, sales engineers, and internal sales) and channels (partners, independent agents, retail stores, brokers, dealers, value-added resellers [VARs] and other third parties), as well as requirements for audit trails and corporate governance. Consequently, incentive compensation can be difficult to manage because organizational changes occur and are unpredictable; territories change; personnel get re-assigned; brokers enter or leave the network; market conditions change (requiring new incentive strategies); and organizations merge (thereby combining different compensation plans and cultures into a single company).

As indicated earlier, such companies need to model or project the cost of compensation pay in order to tackle the major business challenges of planning, budgeting, and forecasting. Namely, as companies move beyond the midpoint of a given fiscal year, executive and sales management often wonder how different market-driven compensation plans would have performed. Executives may want to take the company's actual revenue and transactional results and track them against proposed compensation plans, in order to see how much more or less they might have cost in incentive pay. But as companies enter the latter part of a fiscal year, they need to start designing compensation plans for the following fiscal year. They must be able to design hypothetical compensation plans and run hypothetical revenue and transactional scenarios against them, to ensure that the projected cost of compensation is within budget guidelines. Finally, once companies are in a new fiscal year and compensation plans are in place, they need to forecast the cost of compensation on a regular basis (typically monthly). Companies take the compensation expense forecast and book it as an accrual on their income statements.
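The kind of what-if replay described above can be sketched as follows (the plans, rates, and figures are all invented for illustration): actual transactions are run through both the current plan and a proposed plan, and the difference in projected payout is what gets compared against budget guidelines.

```python
# Hypothetical what-if model: replay actual transactions against the current
# and a proposed compensation plan to project the cost difference.

def plan_cost(transactions, rate_for_product):
    """Projected incentive cost of a plan: amount times the product's rate."""
    return sum(t["amount"] * rate_for_product.get(t["product"], 0.0)
               for t in transactions)

actuals = [
    {"product": "legacy",   "amount": 900000.0},
    {"product": "new_line", "amount": 100000.0},
]

current_plan  = {"legacy": 0.06, "new_line": 0.02}  # favors existing lines
proposed_plan = {"legacy": 0.04, "new_line": 0.08}  # shifts pay to the new line

delta = plan_cost(actuals, proposed_plan) - plan_cost(actuals, current_plan)
# A negative delta means the proposed plan would have cost less in incentive pay.
```

The same replay, run against hypothetical rather than actual transaction scenarios, is what supports next-year plan design and the monthly compensation-expense forecast booked as an accrual.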

This is a shift away from the broken ritual of the annual budget, whereby traditionally, budget forecasts and analyses were made once a year; nowadays, those figures need to be constantly updated. Yet, most organizations do not have the data or the tools required to effectively model incentive compensation for planning, budgeting, and forecasting, and must rely on inaccurate data to come up with cost projections. Furthermore, for most companies, the ability to model even simple adjustments is arduous, and the ability to fully assess the impact of annual changes or the common mid-year tweak is virtually impossible. As a result, actual compensation expense can vary drastically from budgets and forecasts, thereby impacting earnings per share and stock price.

Are Sales Incentives Even in Tune with the Corporate Strategy?

Incentive compensation plans are designed to motivate sales and service professionals to achieve goals and strive for excellence. But an alarming fact is that these same compensation plans are often at odds with the corporate strategy of customer satisfaction, since sales employees, in their zeal to earn more, often lose sight of what is important: their customers' needs and the company's strategy.

Part Two of the series Thou Shalt Motivate and Reward Workforce Better.

If the company wants to increase sales of a new product line, for example, but the direct sales and indirect channel still receive hefty incentives favoring existing product lines, the sales folks will logically not care to pursue sales for the new (unrewarding) product line. Conversely, if a new plan at a research or analyst house compensates salespeople only on ongoing usage of research (and not, for example, on original contracts), will fewer new contracts be written as a result? Also, what if a manufacturing company's sales folks are paid on the volume of purchase orders, and keep selling under heavy discounts, or by over-promising non-existing features to customers? The company's profits will likely quickly dwindle, and there are many examples of companies paying immense sales commissions to their sales force (who have, to be fair, all reached their quotas, even if the quotas were inadvertently set wrongly by their superiors), even as the companies in question suffer terrible losses, and possibly go out of business. For more on pertinent issues, see The Case for Pricing Management.

One of the things that is often missing is a good system of metrics for gauging whether the incentive plan is optimally driving revenue. With the wrong metrics or incentive plans, every company is essentially just going out of business faster. Therefore, companies must make sure that they are paying for the most important sales activities, and that those activities are connected to their business strategy and positively contributing to the bottom line (profits). Compensation should pay sales and service personnel to achieve those specific results, even if it means finding a magic formula via a combination of ever-shifting factors, which might include profit, quotas, customer retention, customer satisfaction, product mix, team-based metrics, etc. Finally, calculating compensation can now even be taken outside the sales force to involve those employees who do not directly drive sales. Examples might include paying a marketing person for annual product growth, for designing a product promotion; or a pre-sales person for great software demonstrations to customers; or a specific channel business development person, for channel growth and profitability.

Conversely, rewarding employees for the wrong results can prove especially lethal to customer service efforts. A common call center mistake, for instance, is to reward agents based on performance (such as the number of answered calls) during peak hours, since this can lead to dishonest, yet commonly used, techniques that can dangerously irritate customers. In an effort to reach their call volume goals, agents may place customers back in the call queue (remember the annoying "Please hold. Your call is important to us." line?), pass the caller to another agent, or simply hang up, all in an effort to quickly get to the next caller, which can severely decrease customer satisfaction levels and accelerate customer base erosion. A much better strategy might be to reward agents not simply for call volume, but also for meeting customer expectations (surveyed by a third party), and ideally for the creation and accumulation of new improvement ideas from customers.

Another illustration is provided by an apparel retailer that often fell short of expectations when launching new lines of jeans. This prompted its decision to test market in-store jeans fitters, who were trained through e-learning about the products, how to fit jeans on women, and how to give the best advice. The project purportedly returned a 75 percent increase in revenue, since the fitters began receiving an incentive every time a pair of jeans was sold. There was a performance management solution in the background telling them how they were doing against their goals, whereby learning, performance, and incentives were all tied together in an integrated way to drive corporate revenue and performance.

In the financial industry, if a bank stops giving tellers incentives for referrals to investment advisers, it should be able to model a what-if scenario of what is likely to be saved in incentives versus the gain from increased investment activity. Or, what if the bank takes away incentives for referrals, but raises incentives based on customer service ratings at the branch or individual level? Astute incentive and compensation software should be able to simulate whether the likely influx of new customers and a lower customer turnover rate will provide a bigger bang for the buck than if those incentives were directed toward investments. Such predictive analysis and forecasting capabilities also come in handy for evaluating various incentive plans, for items such as customer satisfaction, investment referrals, loan referrals, and credit card sign-ups, on behalf of tens of thousands of eligible employees.
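At its core, such a what-if comparison is simple arithmetic: the profit an incentive plan is expected to drive, minus what the incentives themselves cost. A minimal sketch in Python, with every figure invented purely for illustration (a real predictive engine would model uncertainty, turnover, and ramp-up effects rather than single point estimates):

```python
def plan_net_value(incentive_cost, expected_extra_revenue, margin):
    """Net value of an incentive plan: profit on the extra business it is
    expected to drive, minus what the incentives themselves cost."""
    return expected_extra_revenue * margin - incentive_cost

# Hypothetical what-if: keep teller referral incentives, or redirect the
# budget toward customer-service-rating bonuses. All numbers are made up.
keep_referrals = plan_net_value(200_000, 1_500_000, 0.20)  # 100000.0
service_bonus = plan_net_value(150_000, 1_200_000, 0.20)   # 90000.0
better_plan = "keep_referrals" if keep_referrals > service_bonus else "service_bonus"
```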

Indeed, pay-for-performance program management tools that allow companies to model, administer, report, and analyze a wide range of metrics by using rules (meaning ways to filter and calculate in the form of an "if-then" statement, where the "if" contains a Boolean expression that selects objects from the database, such as which transactions to use, and the "then" contains formulas that calculate and save new values) are quite superior to more traditional approaches. Such approaches might involve some hard-coded logic or restrictive predefined compensation models, which limit compensation plans to generic, lowest common denominator practices. When they are difficult to reprogram, with no modeling capabilities, incentive programs often cannot be crafted on the fly to meet competitive threats or executive orders, such as a demand to create an incentive plan that would help a bank generate 100,000 new checking accounts by the end of the fiscal quarter.
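The "if-then" rule idea described above can be sketched in a few lines of Python: the "if" is a Boolean filter that selects transactions, and the "then" is a formula that computes a payout. This is an illustrative toy, not any vendor's actual rule engine, and all names and plan terms are invented:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Transaction:
    product: str
    amount: float
    new_account: bool

@dataclass
class Rule:
    condition: Callable[[Transaction], bool]  # the "if": selects transactions
    formula: Callable[[Transaction], float]   # the "then": computes the payout

def apply_rules(transactions, rules):
    """Sum the payout of every rule whose condition matches each transaction."""
    total = 0.0
    for txn in transactions:
        for rule in rules:
            if rule.condition(txn):
                total += rule.formula(txn)
    return total

# Hypothetical plan: 5% on all sales, plus a $50 bonus per new checking account.
rules = [
    Rule(lambda t: True, lambda t: 0.05 * t.amount),
    Rule(lambda t: t.product == "checking" and t.new_account, lambda t: 50.0),
]
transactions = [
    Transaction("checking", 1000.0, True),
    Transaction("loan", 2000.0, False),
]
payout = apply_rules(transactions, rules)  # 50 + 50 + 100 = 200.0
```

The point of expressing plans as data (rules) rather than hard-coded logic is exactly the flexibility described above: a new "100,000 checking accounts by quarter end" incentive becomes one more rule, not a reprogramming project.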

Ideally, such capabilities should enable the likes of chief executive officers (CEOs) and chief financial officers (CFOs) to quickly determine a company's most profitable behaviors, and then provide the intellectual capital (in the form of information) allowing a company to create pay-for-performance incentives that have a chance of working for the entire company. A dollar of sales is not worth a dollar unless it represents the most profitable sale the company can make, and variable compensation management software should provide the tool needed by senior management to ensure that the interests of all stakeholders can be well balanced, while also allowing management to choose the best path to profitable growth. Last but not least, managers can also use the applications to model changes to their incentive-pay programs in order to fully understand the financial effects of new rules before instituting them.

For background information, see Thou Shalt Motivate and Reward Workforce Better and Thou Shalt Manage Human Capital Better.

Pros and Cons of Homegrown Systems

Sales incentive and compensation software tools allow firms to make nimble, complex decisions about how employees should be paid for meeting and exceeding their targets, and help companies get the right products and services to market faster. Yet the dearth of such packaged information technology (IT) solutions has left the vast majority of companies still performing these tasks on rudimentary homegrown systems. Indeed, most of the corporate world (up to 90 percent, according to a recent study by the Indian research and analytics company Evalueserve) still relies on tools designed and developed in-house, which typically means that a sales force's commissions and bonuses are being handled through a combination of Microsoft Excel spreadsheets and a Microsoft Access database (Excel being the front end, and the Access database occasionally representing the backbone).

To be fair, for companies with small sales forces, low transaction volumes, and straightforward sales plans (where IT departments can handle the workload without impeding the company's capability to introduce new products to market, or without hampering the sales force's ability to sell those products and stay motivated while doing so), an Excel spreadsheet on steroids might work well enough to help them keep track of commissions. Some environments with rudimentary sales forces and sales incentive plans can leverage homegrown sales compensation tools to calculate income and provide motivation to make the most of the existing products, services, and customer relationships.

It is interesting to note that company size in terms of revenue is not a determining factor in deciding whether to use a homegrown solution or a packaged software solution for incentives management. A large global aerospace and defense (A&D) corporation might have revenue in the billions of dollars, but if it is only selling a few complex products (such as jet planes or rocket boosters) to the likes of the US Department of Defense (DoD) or National Aeronautics and Space Administration (NASA), the chances are that it does not need to invest significantly in a full-fledged enterprise-level incentive and compensation management package. On the other hand, in insurance, retail banking, and consumer goods environments, where one is selling hundreds of items in a week under different types of incentive plans, it becomes very difficult to calculate incentives quickly and accurately enough to get payments out on time.

There are three determining factors for the delivery model a prospective user company should select when considering a compensation solution:

1. the size of its sales force;
2. the complexity of its sales plans; and
3. transaction volumes.

As a rule of thumb, homegrown spreadsheet-based solutions are best suited for companies with approximately two dozen sales representatives or fewer. The primary advantage of a company building its own system is the savings on the upfront license fee for a vendor's packaged solution, which can range up to a few hundred thousand dollars. However, companies must keep in mind the requirements placed on their IT department when using a homegrown solution. As soon as there is notable sales force expansion or more complexity within the sales plan, a homegrown solution may quickly become inadequate. This is particularly true when one starts talking about credit assignments or splits, overlays, tier or ramped rates, and adjustments or overrides (as when products get returned); these events create a lot of variables, and user companies will be limited in functionality by the basic database, eventually needing the in-house capability to write a lot of code. Last but not least, system limitations can delay new products or services in getting to market, as a company's IT department will be too busy updating the corresponding incentive plan in-house to engage in facilitating sophisticated simulations and modeling.
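To illustrate just one of those complexity drivers, tier or ramped rates work like marginal tax brackets: each rate applies only to the slice of sales that falls within its tier. A minimal Python sketch, with tier boundaries and rates invented for illustration:

```python
def tiered_commission(sales, tiers):
    """Ramped rates: each tier's rate applies only to the portion of sales
    falling within that tier, like marginal tax brackets."""
    commission = 0.0
    lower = 0.0
    for upper, rate in tiers:
        if sales <= lower:
            break
        commission += (min(sales, upper) - lower) * rate
        lower = upper
    return commission

# Hypothetical plan: 2% up to $50k, 4% on the next $50k, 6% above $100k.
tiers = [(50_000, 0.02), (100_000, 0.04), (float("inf"), 0.06)]
tiered_commission(120_000, tiers)  # 50k*2% + 50k*4% + 20k*6% = 4200.0
```

Now add splits (two reps sharing credit), overlays, and retroactive adjustments for returns on top of this, and it becomes clear why a spreadsheet formula stops scaling.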

Job Scheduling Maze in Distributed IT Landscapes – Part 1

We certainly learn new things every day, and sometimes out of pure serendipity. Namely, when I was recently asked by one of my industry contacts (working for a PR agency) whether I would like to have a briefing with and about his client, whose name included the word “batch,” I agreed, thinking it was a process enterprise resource planning (ERP) vendor or maybe a manufacturing execution system (MES) vendor.

To my chagrin, as soon as the Web conference meeting and demo started, I realized that I was in the quite unfamiliar territory of enterprise job scheduling and workload automation, especially when it comes to highly diverse and distributed information technology (IT) environments. Many of these jobs are still run in a batch mode without human interaction and intervention (at least preferably).

Some batch job examples would be: database/data warehouse updates, payroll runs, file copying and archiving, system reboots, disk defragmentation, report printing, processing of insurance claims, billing statements, and/or enrollments, file transfers via the file transfer protocol (FTP), and so on. Thus came the ActiveBatch product name, I guess.

As someone who is not that technically proficient, my knee-jerk reaction was to say, “Sorry, this was a misunderstanding,” but I was drawn in by a good discussion and the product’s apparent usefulness. In fact, the briefing made me think about how often we take things for granted, and how unaware we are of the legwork that IT staff, especially within large corporations, conduct behind the scenes. We are usually aware of the IT folks only when our PC is not working or the email server is down, and when we “need help now!”

The conversation made me realize that this vendor truly understands mission-critical business that requires high-performance systems. ActiveBatch’s originators know what it means to meet both deadlines and service level response times. Conversely, commonly used enterprise applications, databases, and platforms (e.g., Microsoft Windows with Task Scheduler, or UNIX with cron) contain only limited scheduling functions, which address basic requirements within the confines of the individual system.

In fact, from my erstwhile experiences as a Baan (now Infor ERP LN) and SAP R/3 functional consultant, I vaguely recall some task scheduling functions (e.g., setting overnight or weekend material requirements planning [MRP] runs, or performing a trial balance and general ledger [G/L] updates at a certain time). The technical consultants would set up these batch jobs within the Baan Tools or SAP Basis administrative capabilities.

Overcoming Individual Systems’ “Autism”

However, the problem is that as the number of systems, applications, databases, and whatnot platforms increases, the IT business community requires automation among and across these various systems in an end-to-end manner to provide a single point of workload automation. In addition to the ability to integrate all the pieces of a heterogeneous environment, such a system should be able to schedule jobs based on events and associated built-in business logic, as opposed to merely on a set date/time, and in a single-path manner within a silo (as is the case with inwardly-oriented “autistic” enterprise systems).
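The difference between date/time scheduling and event-based scheduling can be sketched as a small dependency resolver: each job runs as soon as its upstream jobs complete, rather than at a fixed clock time padded with slack. This toy Python sketch illustrates only the concept, not ActiveBatch's actual model:

```python
class Job:
    def __init__(self, name, action, depends_on=()):
        self.name = name
        self.action = action
        self.depends_on = list(depends_on)
        self.done = False

def run_when_ready(jobs):
    """Run each job as soon as every job it depends on has completed,
    instead of at a fixed time with built-in wait/idle slack."""
    pending = list(jobs)
    order = []
    while pending:
        ready = [j for j in pending if all(d.done for d in j.depends_on)]
        if not ready:
            raise RuntimeError("circular or unsatisfiable dependency")
        for job in ready:
            job.action()
            job.done = True
            order.append(job.name)
            pending.remove(job)
    return order

# A tiny cross-system workflow: extract data, load it, then print a report.
extract = Job("extract", lambda: None)
load = Job("load", lambda: None, depends_on=[extract])
report = Job("report", lambda: None, depends_on=[load])
order = run_when_ready([report, extract, load])  # ['extract', 'load', 'report']
```

In a real heterogeneous landscape, each `action` would invoke a different system (an ERP run, a database load, an FTP transfer), which is precisely the integration problem a central workload automation product solves.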

To that end, ActiveBatch provides a central point of scheduling that allows each of these disparate systems to be automated and integrated into coherent workflows. The essence of ActiveBatch, Intelligent Automation, is to provide the level of integration for applications, platforms, databases, and specific functions without the need for costly and tedious code scripting (and reliance on programmers).

What prospective customers look to ActiveBatch to accomplish is the following:

* Improve IT service levels;
* Integrate workflows and business processes between diverse applications and platforms;
* Reduce the number of manual errors (which otherwise come with the territory of relying on humans);
* Implement a centralized view of jobs that span across the vast platforms’ landscape; and
* Eliminate an “artificial” wait or idle time that has to be built into existing workflows (to accommodate all imperfections).

In a nutshell, this is about ultimately reducing the cost of IT operations. Some customers cite the fact that often in the past the original enterprise job scheduler (typically made in-house in a heavily customized manner) would allow some jobs to fail without notifying anyone, and the company wouldn’t know anything about it until someone belatedly complained about not getting the output of the job (e.g., a report). Tracing and fixing these instances would often take more time and resources than the job scheduler was supposed to save in the first place.
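The silent-failure problem is exactly what a little supervision logic, which a central scheduler provides out of the box, addresses: every job runs under a wrapper, and a failure triggers a notification instead of vanishing until someone misses the output. A hypothetical Python sketch:

```python
import logging

def run_with_alert(job_name, action, notify):
    """Run a batch job and send a notification on failure, so no job
    can fail silently and be discovered only when its output is missed."""
    try:
        action()
        return True
    except Exception as exc:
        logging.error("job %s failed: %s", job_name, exc)
        notify(f"Job {job_name} failed: {exc}")
        return False

# Usage: collect alerts in a list (a real system would e-mail or page someone).
alerts = []
ok = run_with_alert("nightly_report", lambda: 1 / 0, alerts.append)  # ok is False
```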

ActiveBatch and Advanced Systems Concepts Inc. (ASCI)

But let me backtrack and talk about ActiveBatch’s genesis. The company that owns ActiveBatch is privately held Advanced Systems Concepts, Inc. (ASCI), with headquarters in Morristown, New Jersey (NJ), United States (US). It was founded in 1981 as a system software engineering and consulting company focused on the development of products for the former Digital Equipment Corporation’s (DEC) OpenVMS operating system (OS) product (meanwhile, of course, DEC was acquired by Compaq, and now both are part of Hewlett-Packard [HP]).

ASCI’s first product, INTACT, was a transaction processing system for OpenVMS that allowed customers to use OpenVMS systems in commercial applications. INTACT was licensed to major financial organizations around the world. In late 1986/early 1987, DEC exclusively licensed INTACT from ASCI and renamed it DECIntact, incorporating it as part of its erstwhile Enterprise Application Strategy. DEC’s decision at the time was based on the need for a competitive answer to IBM and its transaction processing system, the Customer Information Control System (CICS). INTACT and DECIntact are both still in use in many organizations around the world.

ASCI has also developed other layered product solutions for the OpenVMS market including:

* SHADOW – the first shadowing/data replication system for OpenVMS;
* WATCH – for help desk and other similar functions that allow one user to watch other terminal sessions;
* Performance Simulation System (PSS) – the automated regression and application testing system for the OpenVMS applications; and
* VIRTUOSO – a virtual container technology that enables the development of virtual disks to be used as random access memory (RAM) and cached disks for performance improvements, encrypted disks for security, and more.

In 1991, ASCI enhanced SHADOW with a new family of products including FileSHADOW and RemoteSHADOW for OpenVMS. RemoteSHADOW for OpenVMS has since been installed in over 1,000 customer environments to reduce the time of data recovery in the event of a system or site loss. For instance, following the dreadful 9/11 attacks in 2001, RemoteSHADOW for OpenVMS was used by many financial organizations, including Dresdner Kleinwort Wasserstein (DrKW), to recover their data and get their business functioning again within hours at an alternate site.

In 1996, ASCI broadened its OS focus from exclusively OpenVMS to include UNIX and Microsoft Windows. One of the UNIX-based products that is still in use is DeviceShare. For its part, XLNT (Extended Language for NT), a command and scripting language for Windows (not only Windows NT, as the name would imply), was introduced to offer system administrators a scripting alternative to managing systems and developing workflow scripts. Over 100,000 XLNT licenses are currently in use around the world.

Enter ActiveBatch

In 1998, XLNT users needed a batch system to automate and run their XLNT scripts on a schedule. ASCI thus introduced the Batch Queue Management System (BQMS) to help XLNT users automate their scripts on and across Windows servers. In 2000, BQMS was renamed and re-introduced as ActiveBatch V3, a heterogeneous job scheduling solution for Windows, UNIX, Linux, and OpenVMS systems.

ASCI proudly claims to be self-funded, and that its development, quality assurance (QA), and support teams are its own (and not outsourced or off-shored). The company has licensed ActiveBatch to over 1,400 customers in 34 countries around the world.

With its Intelligent Automation capabilities and performance, having been tested across 2,000 servers performing over 1.3 million jobs per day, ActiveBatch is fast becoming a workload automation and enterprise job scheduling solution of choice. Additionally, ASCI has licensed nearly 4,000 clients in 34 countries around the globe for the full range of its products. Its clients include many Fortune 1000 companies, with a mix of medium to large enterprises.

ASCI competes with several powerful and renowned application providers in the Workload Automation and Job Scheduling market including Computer Associates (CA) Unicenter AutoSys, BMC CONTROL-M, and IBM Tivoli. Many of these vendors’ products were originally developed for the mainframe, not necessarily for today’s heterogeneous, horizontal server environments. As a result, they have customarily been adapted (retrofitted) to today’s distributed server environments.

More modern job scheduling vendors’ products like Tidal Software, UC4, Redwood Software (especially in SAP environments), Quartz (open source), and of course ActiveBatch, understand the distributed server environments, and have targeted their solutions to address this requirement. Part 2 of this blog series will analyze the ActiveBatch architecture and evolution in terms of functional and technical capabilities.

Your views, comments, opinions, etc. about any of the above-mentioned enterprise job scheduling solutions, and about the software category per se, are welcome in the meantime. I would also be interested in hearing about your experiences with these solutions (if you are an existing user) or your general interest in evaluating them as a prospective customer.