Friday, December 4, 2009

Three Ways ERP Can Help Manage Risk and Prevent Fraud

Business is all about taking risks. But intelligent managers know how to manage risks, thus preventing accidental losses as well as other operational, financial, and strategic risks—including fraud.

To manage business risks with technology, we must first understand and prioritize the risks a specific business faces, and then determine how IT can help that business. Only then can we see how those risks intersect with the IT systems the business might already have in place.

One risk within your business may stem from operating in an e-commerce environment. In that case, you want to know how IT is supporting the Web portal. Do people simply view a catalog, or do they order online and log back into your system later to view their order status? How does that portal tie in with your back-end systems and business data?

Or maybe you have multiple business units, several running on a top-tier enterprise resource planning (ERP) system like IFS Applications. But a Mexican unit is still running a homegrown application, passing its data to you in spreadsheets modified to reflect currency exchange. The manual processes involved in this data transfer and data alteration represent a business risk that could be mitigated by the built-in security features of an ERP system.

So, while technology might be designed to assist in risk management, that technology must still be configured and used intelligently to deliver this business benefit.

Indeed, intelligent use of an ERP system can not only help ensure compliance with legal requirements and accounting rules, but it can also help prevent fraud. An ERP application and its user permissions settings can prevent theft. Aggressive and intelligent use of an ERP system's safeguards can save time during auditing. Properly configuring an ERP application can help protect your company from fraud and costly corporate mistakes in a number of ways. Following are three practical approaches a business can take to protect its assets through its ERP system.

1. Use a top-down approach to identify risks.

Business risk management requires a top-down approach. Senior management often focuses its efforts on creating competitive advantages and might not see one in spending extra money on compliance. But even companies not immediately affected by regulations like the US Sarbanes-Oxley Act (SOX) and the Health Insurance Portability and Accountability Act (HIPAA) of 1996 can benefit from applying some of the principles required for compliance to their business. Efforts to comply with basic data security and risk prevention guidelines can further reduce the risk of financial loss from administrative mistakes or fraud. The specific steps necessary to ensure compliance with these guidelines will differ from one company or business model to the next, but any company needs to pay attention to such basics as sound financial statements, data security, privacy, and the housing of key information—and how that information affects things like accurate financial reporting.

Part of this top-down approach involves identifying what information is key to your business. For a manufacturer, this data might consist of accounting, payroll, and health insurance information, plus things like physical plant assets and inventory. In contrast, a professional services environment is much simpler, with key information consisting of things like customer service and payroll data, with the only other real assets consisting of phones and perhaps leased office space.

Your Reference Guide to SMB Accounting Software Features

So, you're looking for an accounting system.

This reference guide provides insight into the accounting features and functions currently available on today's market for small to medium businesses (SMBs). It will help you determine which features your organization needs—or doesn't need.

You can also download an extended guide in Excel format at TEC's Accounting Software Request for Proposal (RFP) Template page.

But first, here's a brief overview:

What Are Accounting Systems?

Accounting systems manage procedures for accurately entering, tracking, and maintaining information related to an organization's financial operations. These accounting applications typically support general ledger, accounts payable and accounts receivable, payroll, job and project costing, and multinational accounting.

Many SMBs require that other functions (such as inventory control, manufacturing management, and financial reporting) also integrate with their accounting system.

About This Guide

Although a full accounting system RFP can contain upwards of 4,000 features and functions, we'll focus on the "big picture" features for now, for (obvious!) considerations of space.

You'll notice that we've grouped accounting features by broad category. These categories correspond to a high-level functional breakdown of software features. In this reference guide, we provide a brief explanation of how each category impacts your accounting processes.

If you'd like more information about a full listing of accounting software features and functions, please visit TEC's RFP Templates page.

Reference Guide to SMB Accounting Software Features

1. General Ledger

Chart of Accounts
The chart of accounts is, for all practical purposes, the business management system. If revenues and costs are not captured and segregated into the best suited categories, the financial statements you produce will be useless.

Transaction Processing
This category describes features that address typical journal entry processes, including general transaction processing, workflow period closing, batch layout configuration, and job cost adjustments.

Month- and Year-end Closing
While you can bill revenue and collect cost information, if this information is not published in the form of financial statements in a timely manner, the statements themselves are essentially useless.

Control Reports
All business management systems must have some form of controls to make sure information is input correctly. Software features covered in this category are designed to accomplish this task.

Financial Statements
Financial statements drive the company. However, for smaller companies this may not be true to the same extent, since the owner or manager should have a "feel" for operations rather than relying on printed reports. Larger companies cannot do this, simply because they are too big.

2. Accounts Payable

Vendor Master File
Master files are the starting point in any application. For accounts payable, the vendor master file must be set up first, as that drives the rest of the accounts payable functions.

Purchasing Controls
While anyone can issue a purchase order, the process should be controlled. This category covers the purchasing process as well as control systems you can use.

Data Input
Once a purchase order has been sent and goods received, the obligation for that purchase needs to be recognized. This category reviews the various steps required to actually get information into accounts payable.

Payables Analysis
Once an invoice has been input, it needs to be approved and scheduled for payment. This category covers those steps.

Check Writing
Once an invoice has been processed and approved, it needs to be paid. This category addresses various check-writing features, including bank account assignment and check formats.

Control Reports
While you may choose to assume that information has been input correctly, that is not always the case. The features in this category address reports that give users the ability to check information to make sure it has been input correctly.

Financial Reports
Once data has been input into accounts payable, users will probably need to review slices of that data to determine if costs are in line, where costs are being incurred, and how those costs compare against other benchmarks.

Employee Training in a Recession




As organizations reassess their staffing levels, many employees are being asked to do more with less. Aside from reducing headcount, many organizations are cutting back on employee-related expenses, even those that can provide long-term benefits. Examples include application training and travel to user groups where employees can network and exchange best practices. This article discusses the increased importance, benefits, and risks of employee training in a recession with respect to enterprise systems.

Growing Organization Risks

While understandable and often imperative for the continued survival of an organization, the aforementioned cutbacks promote a vicious cycle of increased organizational risk:

* Organizations reduce or eliminate formal training and informal opportunities for users to learn how to better utilize enterprise systems.
* This solidifies many users' bad habits and suboptimal processing methods.
* At the same time, organizations trim staff, resulting in more work among fewer employees. This means even less time for cross-pollination where employees are trained in multiple jobs.

Organizational risk is compounded if key employees leave the organization and, as is often the case, user documentation is lacking. For example, incumbents may scramble to figure out how Alex ran regular interfaces, Neil matched invoices, Julian filed tax reports with the government, and Nancy created database backups. If Alex, Neil, Julian, and Nancy are no longer with their organizations, then they are, in all likelihood, unable or unwilling to assist their former employers in the event that their help is needed.

Often, the best case scenario is that jobs performed by ex-employees are partially understood by their replacements. Nonetheless, this may very well result in increased risk of error, financial irregularities, expensive engagements with external consultants, or some other highly undesirable outcome. In the extreme, a single employee's departure may result in a missed payroll, an eventual government audit, or security breaches.

Opportunities and Benefits

Even organizations with tight budgets that do not need to reduce headcount at present face a fundamental tension between lean staffing levels and organizational bench strength. Lack of widespread end-user application and technical knowledge is dangerous in the event that a key employee decides to walk. Yes, even in these economic times some employees voluntarily leave their jobs for whatever reason.

To this end, organizations should consider expanding employee training, not cutting back. Whether employees are being cross-trained in different functions or learning new technologies altogether, the benefits of training can more than offset its costs. First and foremost, training mitigates the risk of key employee turnover. Second, the mid- or long-term savings from training may more than pay for it. Two super users with substantial skills and a global perspective may be able to do the work of three or four limited end users, especially if they are skilled in different automation methods. Finally, while hardly tantamount to reassuring nervous employees about their employment futures, training sends a strong message to attendees: the organization wants you to develop your skills. The message becomes "despite current economic challenges, we are committed to growing our employees' skills and abilities." This attitude may reduce the likelihood of voluntary employee attrition.
Once the organization has decided to move forward with training, it has a fundamental decision to make: where will the class be held?

Organizations that want to build internal expertise in new applications have two choices: they can send their employees either to public training classes or to private ones. Public classes typically take place at vendors' offices or at vendor-approved locations. These classes cost in the neighborhood of $500 per day per student. Many organizations in different stages of an implementation send users to public classes to learn how their systems work in a generic sense. In other words, a payroll manager should not go to a public class intent on learning how to set up and process payroll at her company, although she should walk away with more than a few ideas from the class. Because payroll personnel from other organizations attend public courses, the instructor will discuss the payroll application in general terms.

For public classes, clients travel to vendor sites, sometimes incurring significant travel costs. To the extent that client end users are out of the office, they should be able to focus exclusively on the class and the applications being taught. From a technical perspective, vendors should have sufficient computer terminals and training data areas. In other words, clients need no organizational IT involvement to attend a public class, nor do they necessarily need to bring laptops with the applications already on them.

Private classes are very different from public ones, both in terms of cost and content. For one, it's not uncommon for a vendor to charge upwards of $3,000 per day for a customized class at the client's site, in part because vendors know that client end users will not have to incur travel costs. Thus, from a strict cost standpoint, a private class with more than six people will probably be cost-effective for the organization. As for content, instructors will typically customize agendas specifically for each client. In a private payroll class, for example, the payroll manager can ask many specific questions related to her company's payroll setup and processing.
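To make the break-even intuition concrete, here is a minimal sketch using the per-day figures cited above ($500 per student for a public class, $3,000 flat for a private one). The travel estimate is a hypothetical placeholder; ignoring travel entirely puts the break-even at exactly six students.

# Rough daily-cost comparison for public vs. private training.
# The $500 and $3,000 figures come from the article; the travel
# estimate is an assumption -- substitute your own numbers.

PUBLIC_RATE_PER_STUDENT = 500    # per day, per student
PRIVATE_FLAT_RATE = 3000         # per day, on-site class
TRAVEL_PER_STUDENT = 300         # per day equivalent (assumption)

def daily_costs(students, travel=TRAVEL_PER_STUDENT):
    public = students * (PUBLIC_RATE_PER_STUDENT + travel)
    private = PRIVATE_FLAT_RATE
    return public, private

for n in range(2, 9):
    public, private = daily_costs(n)
    cheaper = "private" if private < public else "public"
    print(f"{n} students: public ${public:,} vs. private ${private:,} -> {cheaper} wins")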

While it may be less expensive for clients to host private classes in which trainers come to them, understand that employees attending private classes are in the office. Crises or emergencies can take them away from the class, reducing overall learning. Also, from a technical perspective, the trainer is not going to bring laptops configured with the software and training data areas. Consequently, the amount of IT involvement is much greater than for a public class. The organization that brings in an instructor at $3,000 per day should ensure, well before the trainer's arrival, that its hardware and software are "up to snuff". Nothing inhibits a class and frustrates all concerned more than "buggy" software and the lack of a proper training data area. The last thing that a client's management wants from a private class is a disaffected end user base.

Outside of a formal class (whether public or private), independent learning has become more popular, with recent advents such as web-based training (WBT) gaining ground. While the cost savings are obvious and the convenience factor is high, remember that employees at their desks are often distracted by daily calls, e-mails, and old-fashioned door knocking. Consequently, the cost of a public course can sometimes be justified by the additional learning that tends to take place in an isolated environment.

Considerations and Caveats

Training for training's sake is fruitless. Organizations need to ensure that their training investments will result in tangible benefits. Users may learn a robust new technology over the course of a three-day class. However, this certainly does not equate to mastering it or deploying it in the organization, even for highly motivated and skilled attendees.

HR Software Selection: It Ain’t Rocket Science

So … you're either in the market for your company's first human resources management system (HRMS), or you're ready to move up to a more sophisticated system, and you're struggling with whether to bring in the consultants or tackle the evaluation process by yourself. Can you do it yourself? The answer is yes, and maybe. It all depends on whether or not you can commit the time and resources to doing it right.

Be prepared for a fairly time-consuming process, ranging from three to nine months or more. You'll also need to recognize that it will require a considerable outlay of capital and staff resources to bring the project to fruition. Your company will have to live with your decision for the next eight to ten years or more, so you want to make sure that you do it right—the first time.

For starters, it will be extremely helpful to begin with a clean slate and an open mind. The less biased you are regarding a specific software vendor or application (either “for” or “against”), the easier it will be to make an objective evaluation and decision. In the last five years or so, vendors, software applications, and hosting options have undergone significant changes. Making a decision based solely on past experiences (good or bad) may be a disservice to your organization. Try to remain unbiased throughout the evaluation process. Whether you use a consulting firm to help you in your quest, or decide to strike out on your own, following a rock-solid methodology is the key to success.

User-Needs Assessment

The process begins with a user-needs assessment (UNA) to develop a wish list of features and functionality you expect the new system to deliver. In this phase, brainstorming sessions with each group of functional and technical users are scheduled to discover what they like about the current system, what they dislike about the current system, and what they are looking for in a new system.

There are various ways to gather this information. Some people use questionnaires and surveys, and some use the tools that software evaluation companies like Technology Evaluation Centers (TEC) offer to help organizations build their requirements. In addition to these tools, the interactive nature of face-to-face meetings with small groups of users allows useful information to be exchanged to help the decision process along. Some users may know exactly what they want out of a new system, while others may have difficulty envisioning their future needs. You'll have to facilitate the discussions and prod the users by asking such questions as "What are you doing manually today that would save you time if it were automated?"

While you're conducting these sessions, be sure to ask how new features and functionality will affect efficiency and productivity. This information will help you later when you prepare a cost-benefit analysis to justify the expense of the new software.

Next, take a look at some of the software vendors present in the marketplace. In the mid-market, there are no fewer than 20 vendors that could be considered "players" (prominent in the industry), and there are many more trying to enter this market. There is no way that you can evaluate all of them in depth, but there are some shortcuts you can use to narrow down the choices.
One option available is to see what the analysts have to say about the major players in the HRMS software arena. Gartner's Magic Quadrant for HRMS, Forrester's Wave, or TEC's eBestMatch™ are all effective decision support systems offered by software evaluation organizations that can help you evaluate different vendors.

Another option is to attend conferences, such as those held by the Society for Human Resource Management (SHRM) and the International Association for Human Resource Information Management (IHRIM). These conferences usually have an exhibit hall, and the major HRMS vendors will each have a booth and product specialists on hand to discuss their offerings and to provide demos of their solutions. Often they will arrange a vendor shoot-out, where each vendor demos its solution to the assembled attendees, and the attendees decide for themselves which offers the best solution.

You can also use the Internet to research the vendors, but beware: this could turn out to be the equivalent of looking for a needle in a haystack. On the day I wrote this article, I did an Internet search on “HRMS software vendors,” and it returned 728,000 hits. You are going to either have to narrow your search, or be prepared to do a lot of surfing.

One other way is to do some sleuthing and try to find out what your competition is using. There are several vendors that have carved out a niche in the marketplace, and that have specialized solutions tailored to specific market verticals (such as health care, professional services, manufacturing, etc.)—a case of "birds of a feather …" This research should allow you to narrow your choices down to a handful of promising vendors. Call each one to request a copy of its current product information, and see if it has an online demo that provides a high-level familiarity with its products. This should help you come up with a shortlist of perhaps three or four prospective vendors.

Breaking with Tradition

Traditionally, the next phase of the project would involve issuing a request for proposal (RFP), in which you would draw up a list of high-level requirements and submit it to a large list of HRMS software vendors. After reviewing your requirements, vendors would notify you if they intended to compete for your business.

This approach is time-consuming because you have to wait for vendors to complete their reviews of your requirements and contact you with their intentions. The next step would be to draft a request for information (RFI) to send to those vendors that chose to compete. The RFI should describe your requirements in more detail, list your project goals and objectives, provide a high-level project timeline, and request background on the vendor (including information about its proposed solution, the technical architecture employed, implementation approach and methodology, hosting options or partners, references, testimonials, and pricing strategies).

I recommend skipping the RFP and instead contacting the three or four shortlisted vendors that you feel may have the best solution for your needs (based on your research). Your goal is to determine their willingness to conduct a scripted demonstration (versus a canned sales demo). This requires the vendor to "script" (prepare in detail) the demo according to your specific needs and scenarios.

Once you have the vendors' commitments, you can then send them a detailed RFI, and schedule the scripted demos. By skipping the RFP process, you can reduce the project timeline by as much as a month.


Wednesday, December 2, 2009

SalesLogix Version 6.0

On the same day, Best Software announced the availability of version 6.0 of SalesLogix, one of the leading small business and mid-market CRM products, which is reportedly built on a new architecture and delivers more than 200 product enhancements. The SalesLogix architecture was reportedly enhanced to increase developer productivity, enable quicker customization, and reduce implementation time and cost, since the product is based on a 3-tier architectural environment with a standardized toolset for open development and possibly rapid deployment. The architecture should facilitate data integration with applications built in Microsoft's Visual Basic, Visual Studio .NET (VS.NET), or any other development environment that supports Microsoft ADO (ActiveX Data Objects) data access.

The new architecture should also allow for backward compatibility with existing customizations and add-on products, thereby simplifying the upgrade process for customers and partners. Administrators running SalesLogix 6.0 should also be able to reduce the time spent on manual tasks by using the streamlined new-user entry process with user profile templates and the simple, flexible user and team security controls. Other enhanced processes include account permission configuration for teams, advanced territory realignment with scenario analysis, improved integrity checking to eliminate "orphan" accounts, and easy identification of users and teams that have access to an account. In addition to the new architecture, users should benefit from the following enhancements:

* Sales Client User Interface — Users should now be able to manage multiple addresses within account and contact records, create account hierarchy and navigate among parent and subsidiary accounts, and launch customer and prospect location maps, websites, and e-mail with one-click web access. In addition, SalesLogix 6.0 will also ship with Crystal Reports version 8.5.

* Tighter integration with Microsoft Office — SalesLogix 6.0 reportedly offers one-click export to Excel for analysis and reporting. In addition, the product offers Word integration for a 'mail merge' facility and advanced Outlook integration.

* Significant Mail Merge improvements — This feature should be easier to use with template management; merge at contact, account, opportunity, or group level; and the ability to automatically exclude individuals based on their solicitation preferences. Users can now also attach a record or copy of the e-mail and attachments to a recipient's history records and automatically schedule activities as part of sales or marketing workflow.

* New AutoSync feature — For mobile employees, the product will now automatically synchronize in the background when a web connection is available, allowing employees to work without disruption and eliminating the need to remember to manually synchronize their data when returning to the office.

* Enhanced Web Client functionality — SalesLogix customers should enjoy Outlook and Excel integration, Crystal Enterprise web reporting, mail merge with customizable e-mail templates, and the groups and query builder. The Web user interface has been updated with an improved design and additional functionality to mirror the rich functionality of the SalesLogix Windows client.

* Improved Support WebTicket — Designed to improve ticket workflow management, the Support WebTicket has a look and navigation that is very similar to the SalesLogix Support Client. Enhancements include: integrated knowledge base with keyword highlighting and automatic creation of frequently asked questions (FAQs); customer self-service portal with two-way communication; addition of activities and attachments to tickets by both employees and customers; and employee visibility to defects, return material authorizations (RMAs), and ticket changes.

Payroll Services Checklist

Before talking to a payroll services provider, you will need to know the following about your current situation:

* How many employees are in your organization?
* Will your company be in growth mode over the next five years?
* How quickly are you looking to introduce a payroll solution?
* What have you budgeted for a payroll solution?
* Does your company have the IT resources to support an in-house payroll solution?
* What kind of flexibility in payroll timing does your company demand from a solution?
* Are you looking to gain insight from employee earnings data?
* Is linking payroll functionality with compensation analysis a business priority?
* How much control of private data are you comfortable relinquishing to a third-party provider?
* Does your company lay claim to stringent data security controls?
* What degree of customization do you expect from a payroll solution?

Stay on top of the software comparison process.

Just because you heard that one software application worked wonders for your competitor doesn't mean that the same software will help your small to medium business.

And just because you heard that one software deal saved your competitor heaps of cash doesn't mean that the very same software will be a fantastic deal for your small to medium enterprise.

Find out what you need from business and accounting software.

Want a little more help creating a handy list of your business and accounting software comparison criteria?

Understanding the Ideal Candidate Page

For each high-level criterion in the TEC ERP Evaluation Center's knowledge base, there are four graphs. The first two graphs are baseline graphs. In the baseline graphs TEC normalizes all criteria to an equal relevance, which allows you to see how a vendor's product scores on its own merit, without regard to any one module taking precedence over another. By checking the vendor's results against a normalized baseline, you clearly see the modules and functionality on which the vendor puts the most emphasis.

The second set of graphs is prioritized according to groups of criteria. TEC adjusts the baseline in these graphs so that it corresponds to each vendor's focus. The prioritized graphs make the vendor's strengths stand out against its weaknesses. A group of criteria increases or decreases its contribution to the vendor's scores according to the type of support the vendor provides.

When you go through the graphs for a vendor, notice that in each set of graphs (the baseline pair and the prioritized pair) there is a global priority bar graph and a contribution analysis spider graph. You can look at the global priority graph and, by glancing at the height of its bars, see the criteria that are the vendor's greatest strengths. By comparing the baseline graphs to the contribution analyses, you will see what the vendor supports in relation to a benchmark of the criterion's optimal contribution.
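The underlying idea of a baseline versus a prioritized view is essentially a change of weights in a weighted average. The sketch below only illustrates that idea, not TEC's actual knowledge-base model; the criteria, scores, and weights are hypothetical.

# Hypothetical scores for one vendor on four high-level criteria (0-100).
criteria_scores = {
    "Financials": 90,
    "Human Resources": 60,
    "Manufacturing Management": 75,
    "Inventory Management": 80,
}

def weighted_score(scores, weights):
    total = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total

# Baseline: every criterion normalized to equal relevance.
baseline_weights = {c: 1.0 for c in criteria_scores}

# Prioritized: weights adjusted to reflect the vendor's focus (illustrative only).
prioritized_weights = {
    "Financials": 2.0,
    "Human Resources": 0.5,
    "Manufacturing Management": 1.0,
    "Inventory Management": 1.5,
}

print("Baseline score   :", round(weighted_score(criteria_scores, baseline_weights), 1))
print("Prioritized score:", round(weighted_score(criteria_scores, prioritized_weights), 1))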

Saturday, November 7, 2009

Inter/Multi-Company Accounting

A particular differentiator would be Solomon's inter/multi-company accounting capabilities, which are pervasive throughout the entire product's functional footprint (even in the AR module, which is not easily found in other accounting products, including MBS Great Plains). The Inter-Company Export/Import tool takes information from multiple databases and uses it in transactions, while users can define access rights to particular company data. Multiple companies in separate databases will have the same account structure but different charts of accounts. Although the invoices for different companies are entered in the same batch, checks are printed by company, and sales tax history, for example, is listed by company as well. Solomon is one of the few accounting products in its class where one supplier's master data can be shared among multiple companies, and the product also features a multi-company cash disbursements capability.

E-Business Modules

Solomon was also one of the first products among its peers to feature a number of e-business modules that provide a browser-based portal and electronic data interchange (EDI) features, letting employees, customers, and business partners interact and retrieve important information from the company's accounting system through an Internet connection, without the need for Microsoft Terminal Server (MTS) or any similar web-enabling component. The well-publicized Solomon Desktop module was the first one launched, back in 2000, to allow secure, browser-based, anytime-anywhere access to all Solomon modules. The e-Commerce Gateway is another module that lets a company transmit and receive paperless electronic documents from business affiliates and partners. The module maps raw EDI data into meaningful Solomon business transactions and verifies the data to be added to its databases. The eVoucher application is used for the entry of AP vouchers and vendor account maintenance, which comes in particularly handy for remote locations.

Solomon Stands the Test of Time Despite Changing Masters Part Three: Product Differentiators

Furthermore, as mentioned earlier, MBS Solomon offers several series or groups of integrated modules that address different business types and needs. This modularity has the advantage of allowing prospects to start frugally with only the necessary base modules (e.g., accounting) and to incrementally add more functionality as needs demand and budgets permit, without the complexities of switching accounting application vendors or converting databases.

Of all the MBS products, Solomon is apparently the purest in terms of a standard Microsoft technology stack, without any proprietary additions (such as Great Plains' original Dexterity environment; Navision's proprietary integrated development environment C/SIDE, which includes a proprietary Navision Server database and a proprietary 4GL programming language, along with Navision's strong analytical features using Sum Indexed Flow Technology (SIFT); and the proprietary MorphX graphical development suite for Axapta). It is also a single-code product, with the same look and feel for both small and midsize customers, which has long differentiated the product from its competitors and MBS siblings (e.g., MBS Great Plains vs. Dynamics, Epicor Vista vs. Vantage, Best Software Peachtree vs. MAS 200, etc.) that currently offer separate products for the lower and upper ends of the mid-market. MBS Solomon Standard is a lower-priced offering of the Solomon edition that addresses the needs of lower mid-market enterprises with smaller information technology (IT) budgets and less complex business structures (e.g., fewer companies or divisions), typically with twenty-five to ninety-nine employees, annual revenues of $25 million (U.S.) or less, and up to ten licensed users.

Furthermore, its sharp focus solely on Microsoft technology from the ground up, captured in "the power of one" motto (one OS platform—Windows XP/NT/2000, one database platform—MS SQL Server, one development environment—MS Visual Basic, etc.), also presents an attractive, risk-averse option for penny-pinching mid-market customers. Solomon IV has consequently been very competitive in speed of implementation (from only two weeks to four months in duration), feasibility of customization, total cost of ownership (TCO), and price/performance ratio. The product architecture has been devised entirely from scratch within the Microsoft context, which provides for flexibility and ongoing agility.

While its former and current competitors, particularly Sage (Best Software's UK-based parent company) and Great Plains, may have a more extensive partner channel within the industry, Solomon's indirect channel is more nimble and focused, owing to a single-code product portfolio that reduces deployment and support requirements across its entire market segment. In addition, the former Solomon had supplemented its more than 500 VARs with the above-mentioned STC network, which would provide global service consistency and additional leverage for the channel.

Financial Series

At the heart of MBS Solomon is the Financial Series, which is made up of typical accounting and financial applications such as General Ledger (GL), Accounts Payable (AP), Accounts Receivable (AR), Cash Manager, Currency Manager, Multi-Company, Financial Statement Translation (FST), FRx Reporting, and Payroll/Direct Deposit. GL account and sub-account numbers can be up to thirty characters in length, whereby the main account number can be up to ten characters, and the remaining twenty characters can include up to eight user-defined segments. GL transactions can be entered using several types of transaction batches, including non-recurring, recurring, manual, and one-sided adjustment, and the GL account determines whether the transaction will operate in multi- or single-company mode. Transactions can be entered for any prior fiscal period or year as well as for future periods, which allows items such as installments and prepayments to be managed at one time, rather than month after month. The Financial Statement Translation module is compliant with Financial Accounting Standards Board (FASB) Statement 52, Foreign Currency Translation, and International Accounting Standards Board (IASB) Statement 125. The module supports translations from one set of books to another set of books, and supports multi-tier translations and consolidations.
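To illustrate what such a segmented account key might look like in practice, here is a minimal validation sketch. The three-segment layout, segment lengths, and sample values are hypothetical; only the limits described above (a ten-character main account and a twenty-character sub-account split across up to eight user-defined segments) come from the product description.

# Hypothetical segment layout: department (3) - region (2) - project (4).
# Only the length limits are taken from the description above.
SUBACCOUNT_SEGMENTS = [3, 2, 4]

def validate_gl_key(main_account, subaccount):
    if not (1 <= len(main_account) <= 10):
        return False
    parts = subaccount.split("-")
    if len(parts) > 8 or sum(len(p) for p in parts) > 20:
        return False
    return [len(p) for p in parts] == SUBACCOUNT_SEGMENTS

print(validate_gl_key("4010", "100-US-2009"))   # True
print(validate_gl_key("4010", "100-USA-2009"))  # False: region segment too long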

Solomon Stands the Test of Time Despite Changing Masters Part Two: Market Impact

Still, due to its quite belated expansion into the ERP world (let alone extended-ERP), the vendor had long suffered the reputation of being only a best-of-breed accounting software provider. While the former Solomon had accelerated its delivery schedule of new functionality, it was also hard pressed by tight "time-to-market" constraints and limited resources at the time. The functionality delivery schedule it intended for 2000 would have been a tall order for even much more resource-abundant competitors, given that some of these features saw the light of day only in the above-mentioned Great Plains and MBS releases, well after 2000—repetitive build and MRP modules within the Manufacturing Series (yet to be released natively); replenishment, commissions, and shipments modules within the Distribution Series; employee utilization and time and billing within the Project Series; additional e-business functionality like eVoucher; and the Solomon Object Model within the System Tools suite.

However, given that all is well that ends well, MBS Solomon, with its distinct differentiators (e.g., project management, advanced distribution, and field service capabilities, product flexibility, integrated but modular applications, powerful reporting capabilities, multi-company accounting features, etc.) and weaknesses (e.g., rudimentary manufacturing functionality), has ended up, in a blessing in disguise, with possibly the most distinct niche and the least overlap (gray area) with the other MBS ERP products (i.e., MBS Great Plains, MBS Navision, and MBS Axapta). Indeed, MBS Solomon remains the choice for organizations that seek flexible financial systems integrated with distribution, project, or service operations. The strongest customer base thus comes from construction (special trade contractors), wholesale distribution (durable goods), business services, engineering, accounting, research, and management services organizations.

Solomon Stands the Test of Time Despite Changing Masters

Microsoft Business Solutions Solomon (formerly Solomon IV and Microsoft Great Plains Solomon IV) is a prominent business management and e-business suite of applications for small and mid-market companies. It is also a product that some had prematurely written off after it was acquired by one of its erstwhile greatest nemeses, the former Great Plains Software, in 2000 (see Will Solomon Finally Satisfy Great Plains' Insatiable Appetite?), and particularly after its new owner subsequently ended up under Microsoft's roof in 2001 (see Microsoft And Great Plains — A Friendship That Turned Into A Marriage), only soon after to share the fraternity home with yet another former nemesis, Navision, in 2002 (see Microsoft 'The Great' Poised To Conquer Mid-Market, Once and Again). Yet the product seems to be doing just fine, if not much better than that. It appears that the product has several truly differentiating traits that cannot be easily or quickly replicated by its seemingly more robust brethren within the Microsoft Business Solutions (MBS) division. Thus, Microsoft has reason to continue to bolster the product for Solomon's loyal customer base and resellers instead of pursuing less popular options (e.g., stabilization and replacement).

Most recently, in summer 2003, Microsoft Business Solutions (MBS) announced the availability of Microsoft Business Solutions Solomon 5.5, which includes several new features and enhancements in the product's Foundation Series, Financial Series, Project Series, and Service Series of modules. Owing to the product's renowned sweet spot of project accounting, MBS has further developed Solomon 5.5 to meet the needs of small to midsize project-driven organizations, specifically in the industries of business services, management and engineering services, social services, special trades contracting, general contractors, and wholesale trade (durable goods). To that end, Solomon 5.5 includes Microsoft Business Solutions Professional Services Automation (MBS PSA) product features that combine the power of MBS Project Accounting Solomon and the new enterprise version of Microsoft Project 2002 to provide externally focused, project-driven organizations with an integrated financial, project and resource management, knowledge management, time and expense, project accounting, financials, and reporting and analytics solution, based on the Microsoft .NET platform. Additional enhancements to the project accounting capabilities include new indirect rate calculation and new audit trail tracking abilities for contractors of the US federal government, particularly those subject to Defense Contract Audit Agency (DCAA) audits.

The MBS PSA vertical solution became generally available in North America at the end of 2002, following its quite vocal announcement during the Stampede 2002 partner conference (see Microsoft Lays Enforced-Concrete Foundation For Its Business Solutions).

Monday, October 26, 2009

Reflections on Lean Philosophy and the Theory of Constraints

This is Part Seven of a multipart note entitled Lean Manufacturing: A Primer.

For these reasons, a TOC production planning solution might be appropriate for manufacturers with make-to-order (MTO) environments, where demand is volatile and where different product lines share the same resources, resulting in bottlenecks. It could also be used for mixed mode manufacturing. In fact, by offering daily production planning for customer orders received, TOC enables business performance improvements in such environments in terms of lead time or cycle time reductions, increased throughput and sales, service level improvements, and inventory level reductions.

Thus, despite the fact that many people immediately invoke a vision of kanban when lean manufacturing is mentioned, TOC supports a lean philosophy in complex environments. However, where lean planning focuses on the flow and the takt of the flow through the factory, TOC optimizes the flow by focusing on planning the takt through the bottleneck. TOC is also consistent with lean manufacturing in that both kanban, which is a part of the just-in-time (JIT) philosophy, and drum-buffer-rope (DBR), which is a part of the TOC philosophy, represent synchronized, pull-signal production control approaches.

For an exhaustive discussion of lean manufacturing in previous notes, see Lean Manufacturing: A Primer, Lean Tools and Practices that Eliminate Manufacturing Waste, How to Achieve Lean Manufacturing, Manual versus Information Technology Enabled Lean Manufacturing, Enterprise Resource Planning Vendors Address Lean Manufacturing, and The Theory of Constraints Enters the Lean Manufacturing Arena.

The TOC Vernacular

More similarities between TOC and lean can be extracted by analyzing some TOC definitions. For example, in the TOC lingo, throughput is the rate at which the system generates money through sales. In other words, throughput is production that can be invoiced—only monetized sales generated by the system get counted. Building inventory (just for the sake of stocking up), on the other hand, is not throughput in TOC terms. This is consistent with lean manufacturing's focus on the customer and customer value-adding activities. Another example is TOC's definition of inventory, which includes all investments in procuring materials to meet customer demand, such as raw materials, work-in-process (WIP), finished goods, and scrap. The crucial point, however, is that, according to TOC, inventory is a liability and certainly not an asset. This is consistent with lean manufacturing's focus on eliminating waste. Finally, TOC's definition of operating expenses encompasses all the money the system spends to turn inventory into throughput, such as all employee time, depreciation, etc. Therefore, TOC focuses on increasing throughput, while reducing inventory and lowering operating expenses. A TOC cost and managerial accounting system thus logically accumulates costs and revenues into these three areas.

The TOC accounting system is somewhat similar to activity-based costing (ABC), since it does not create incentives (through allocation of overhead) to build up inventory. It is considered to provide a truer reflection of actual revenues and costs than traditional cost accounting. Since it is closer to a cash flow concept of income, TOC accounting provides a simplified and more accurate form of direct costing, one that subtracts true variable costs (those costs that vary with throughput quantity). Also unlike traditional cost accounting systems, in which the focus is generally placed on reducing costs in all the various accounts, the primary focus of TOC accounting is on aggressively exploiting constraints to make more money for the firm. Similarly, TOC's goal is to maximize throughput on the bottleneck, which is equal to the profit, since, according to Goldratt et al's 1984 blockbuster business novel, The Goal, "an hour lost on the bottleneck is lost forever and an hour saved on a non-bottleneck is a mirage."
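To make those three measures concrete, here is a small worked example using the standard throughput-accounting relationships (net profit equals throughput minus operating expense, and return on investment divides that by inventory). The dollar figures are purely illustrative and are not drawn from the article.

# Throughput accounting in miniature. All figures are hypothetical.
sales             = 1_000_000   # revenue actually invoiced
truly_variable    =   400_000   # materials and other costs that vary with volume
operating_expense =   450_000   # everything spent turning inventory into throughput
inventory         =   300_000   # money tied up in raw materials, WIP, finished goods

throughput = sales - truly_variable           # T: only monetized sales count
net_profit = throughput - operating_expense   # NP = T - OE
roi        = net_profit / inventory           # ROI = (T - OE) / I

print(f"Throughput: {throughput:,}")          # 600,000
print(f"Net profit: {net_profit:,}")          # 150,000
print(f"Return on investment: {roi:.1%}")     # 50.0%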
In practice, TOC is implemented by following five straightforward steps.

1. Identify the constraints. This should not be too difficult, since large piles of WIP are very noticeable, and every plant supervisor should know intimately the sore spot or bottleneck within the plant (see the load-versus-capacity sketch after this list).

2. Exploit the constraint. One has to maximize the possible amount of work going through the constraint, while ensuring that there is an uninterrupted flow of work coming into the constraint, so that it never has to wait for work (i.e., an inventory buffer is kept in front of the bottleneck to ensure that it is never idle).

3. Subordinate everything else to the constraint. Since the efficiency at other resources does not really matter, there is no point in upstream work centers producing more work than the constraint can absorb. It is sufficient to provide an indication of the task priority of other non-bottleneck resources, since the utilization of non-bottlenecks is determined by the critical bottleneck.

4. Elevate the constraint. If possible, increase the capacity of the constraint by offloading some work, subcontracting work, adding more capacity (by buying more machines, adding another shift, etc.), and so on.

5. Repeat the entire process for continuous improvement. This is another similarity with the lean philosophy. It is likely that elevating the constraint will stop it from being a constraint, but a new constraint will come to light. One then has to exploit, subordinate, and elevate this new constraint.
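As a minimal illustration of step 1, the sketch below compares weekly load to available capacity per resource and flags the most heavily loaded one as the drum. The resources, hours, and capacities are hypothetical.

# Find the constraint: the resource with the highest load-to-capacity ratio.
# All figures are hypothetical (hours per week).
resources = {
    "Saw":      (70, 80),   # (load, capacity)
    "Mill":     (95, 80),
    "Assembly": (60, 80),
    "Paint":    (78, 80),
}

utilization = {name: load / cap for name, (load, cap) in resources.items()}
constraint = max(utilization, key=utilization.get)

for name, u in sorted(utilization.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {u:6.0%}")
print("Constraint (drum):", constraint)   # Mill, loaded at roughly 119% of capacity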

DBR Explained

The DBR process is used within TOC to manage resources in order to maximize throughput. In simplified terms, throughput becomes the critical index of production performance. Maximum throughput is typically limited by a single capacity-constrained resource (CCR), or bottleneck, so the focus is on maximizing utilization of that bottleneck.

The term drum-buffer-rope encapsulates the main concepts of DBR. The drum refers to the rate or pace of production set by the system's constraint. The buffers establish protection against uncertainty (e.g., machine breakdowns, material shortages, labor problems, etc.), so that the system can maximize throughput. The rope is a communication process from the constraint to the gating operation that checks or limits material released into the system to support the constraint (i.e., a sort of a pull system, which is yet another similarity with lean).

In TOC, the constraint is viewed as a drum, and non-constraints are, according to Dr Eliyahu Goldratt, like soldiers in an army who march in unison to the drumbeat; that is, all the resources in a plant should perform in unison with the drumbeat set by the constraint. In this regard, one should note that the system constraint may be either internal or external. In fact, Infor reveals that the vast majority of its customers who have implemented the lean and TOC approach have discovered, once the work flow has been corrected, that the market becomes the constraint. Other constraints to throughput include resources, materials, and, most insidiously, management.

Thus, DBR begins by identifying a critical bottleneck, which is the strategic drum or synchronous control point. The drum schedule for the plant, which sets the pace for the entire system, must reconcile customer requirements with the system's constraints. Other resources may be a temporary bottleneck for a short period depending on the order mix. Market pull is scheduled on the drum, and material is released onto the floor at the rate that the drum can operate. This rate is the rope, which consists of the minimum set of instructions to ensure that non-constraint resources are used (and not over-activated or misallocated). Material is consequently released into the system and flows to the buffers in a way that supports the planned overall system throughput. In fact, material release occurs a set buffer time ahead of demand, so that some physical buffer inventory (but not too much) is present at the drum resource to guarantee its performance and to protect against uncertainty. In TOC, buffers can be either time or material to support throughput or due date performance. They can be maintained at the constraint, convergent points (with a constraint part), divergent points, and shipping points.
Enterprise systems come in handy for the more complex TOC calculations, such as defining the planned start and stop time per order down to the minute, or determining the production rate for the entire factory. A system such as Infor's Easy Lean/DBR system can manage internal constraints, time buffers, and replenishment or kanban buffers. Users can thereby execute operations on the bottleneck according to the planned start time. In addition, the priority on each operation and remaining buffer levels can be visualized—the earliest start time of the buffer indicates how realistic the plan is, while the remaining buffer controls execution priority depending on when it is planned on the bottleneck. As with kanbans, the rule of thumb is to start with a large buffer size and keep reducing it until one has a smooth flow, since the smaller the buffer sizes, the shorter the lead times and the faster the production flow.

Execution on non-bottleneck resources can be managed by indicating the remaining buffer in the system using red, yellow, and green buffer flags. Red flags indicate the highest-priority tasks that should be focused on, while yellow designates less critical tasks, and green denotes tasks that are still within the buffer and thus in "good shape". Operators use these flags to execute tasks according to the priority level, rather than according to a defined order sequence and specific times as in material requirements planning (MRP) or advanced planning and scheduling (APS) systems. This gives operators more flexibility and the ability to make some decisions about which task to execute next. This can increase the motivation level, and is in tune with lean philosophy's employee empowerment mantra.
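The flag logic boils down to how much of each order's time buffer has already been consumed. The sketch below is a minimal illustration; the one-third thresholds are a common convention rather than anything prescribed in the article, and the orders, dates, and buffer sizes are hypothetical.

from datetime import datetime

def buffer_flag(now, due_on_drum, buffer_hours):
    # Return (share of buffer consumed, flag) for an order planned on the drum.
    remaining = (due_on_drum - now).total_seconds() / 3600.0
    consumed = 1.0 - remaining / buffer_hours
    if consumed >= 2 / 3:
        return consumed, "RED"      # act now
    if consumed >= 1 / 3:
        return consumed, "YELLOW"   # watch closely
    return consumed, "GREEN"        # still in good shape

now = datetime(2009, 12, 4, 8, 0)
orders = {                                    # order: (due on drum, buffer hours)
    "SO-1001": (datetime(2009, 12, 4, 14, 0), 24),
    "SO-1002": (datetime(2009, 12, 4, 20, 0), 24),
    "SO-1003": (datetime(2009, 12, 5, 8, 0), 24),
}
for order, (due, buf) in orders.items():
    consumed, flag = buffer_flag(now, due, buf)
    print(f"{order}: {consumed:4.0%} of buffer consumed -> {flag}")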

Yet another differentiator of TOC systems is that, since inventory is held only in front of the critical bottleneck, a company will normally end up with significantly less inventory under TOC than when using MRP or JIT. WIP inventory is often lower than that of kanban systems, because aggregating the buffers offers the same protection overall while simultaneously reducing the amount of protection required. Shorter production cycle times have a similar result.

The Theory of Constraints Enters the Lean Manufacturing Arena

Just as manufacturing realities are continuously changing, so is lean thinking evolving. For example, traditionally, given competitive realities, it was almost exclusively automotive companies that deployed lean techniques such as kanban and sequencing. Today, however, there are strong indications that only one in five companies using kanban is in the automotive industry.

This is Part Six of a multi-part note.

Customarily, lean endeavors lead almost inevitably to flow manufacturing, focused factories, or cellular manufacturing. This is because a key focus of lean is to do only what is needed. It is the polar opposite of the traditional economies of scale, with their large batch approach and resulting long lead times and bloated inventory levels. Large lot sizes are a way of compensating for the fixed cost of a process, such as changeover or set-up costs, transaction-level costs (e.g., releasing orders, issuing parts, closing and reconciling orders, moving product batches into stocks, etc.), and other per order factors. With a large run, these costs can be distributed over a larger number of units, and thus become a smaller cost on a per piece basis. As long as changeover costs are high, small lot quantities are not cost efficient or justifiable. The obvious solution, then, is to lower or eliminate these fixed costs as much as possible so that smaller runs become feasible.
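The lot-size logic above is simple arithmetic: a fixed setup or order cost spread over more units lowers the per-piece cost, so small lots only become economical once that fixed cost is attacked. A minimal sketch, with hypothetical numbers:

def per_piece_cost(setup_cost, unit_cost, lot_size):
    # The fixed setup/order cost is amortized across the lot.
    return unit_cost + setup_cost / lot_size

unit_cost = 10.0
for setup_cost in (400.0, 40.0):              # before and after setup reduction
    print(f"Setup cost ${setup_cost:.0f}:")
    for lot in (1, 10, 50, 200):
        cost = per_piece_cost(setup_cost, unit_cost, lot)
        print(f"  lot of {lot:>3}: ${cost:6.2f} per piece")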

It is this type of thinking that results in production lines that are designed so that there is little or no cost to change from one product to another. This means that a lot size of one (or only a few) is as economical as a large lot on less efficient, non-lean premises. But to achieve this, it is often necessary to restrict the range or variety of product processed in a given cell. Thus, despite those who proclaim that flow manufacturing principles can be implemented successfully regardless of the industry, type of manufacturing environment, or product volumes, the concept has not been all things to all people so far. There are many instances where either flow manufacturing is not appropriate or it is simply not affordable for companies to rearrange their facilities to accommodate the convenient movement of work from one resource to the other.

In fact, manufacturers need to do quite a lot of preliminary work, such as adapting their plants to a flow production model, before even thinking about deploying demand-driven manufacturing software. In other words, they will have to operate in work cells that build families of products, rather than in functional work centers that produce large batches of components or products. They will also need established rules for sending replenishment signals to their internal (i.e., preceding work station) and external suppliers. By establishing time-based process families (and techniques similar to pitch) and monitoring resource loads routinely, there could be a relatively rapid and significant reduction in manufacturing cycle time and a corresponding improvement in delivery performance and productivity, even in job shop environments. Still, these changes will not happen overnight, and the process should begin with the conversion of a few appropriate products with relatively simple production processes, and then progress to other product lines. The implementation of such changes explains why many manufacturers happen to be in a hybrid production mode, with part of the plant running according to flow principles and the rest using traditional material requirements planning (MRP) methods.

For some companies, however, there is simply not enough product similarity to make even this practical. It is challenging, or even unsuitable, to deploy flow or cells in a job shop that makes highly configured-to-order (CTO) or engineered-to-order (ETO) products with high setup times and long lead times. These companies might still appreciate kanban replenishment and demand smoothing, but not line design and standard operation procedures (SOP) or operation method sheets (OMS), since these features would bring little if any benefit to ETO manufacturers. However, such companies' product families often include products that require one or two unique and expensive components in addition to their share of common parts, which could benefit from flow methods for smoothing spikes in demand.

In fact, with appropriate changes in workflow management and the appropriate software to help manage the approach, even companies operating in particularly complex environments can realize significant benefits. For instance, smaller make-to-order (MTO) companies, those that make large or complex products in small quantities or one at a time, and those unwilling or unable to rearrange the plant for flow manufacturing should still be able to reap the primary lean benefits of smaller lots, shorter lead times, reduced inventory and work-in-process (WIP), and higher quality. Infor, for example, claims that dozens of its customers operating in similar environments have seen significant improvements in performance and profitability within two or three months. As long as management buys in, the methods and the tools are available. These results must be kept in perspective, however, since the improvements may not be on par with those obtained by high-volume, repetitive manufacturers. Nonetheless, relative to industry competition, the results could be quite impressive.

In a nutshell, flow systems cannot handle demand variability, variable product mix, shared resource constraints, or complex products with long lead times. This limits flow's applicability to items where variability exists only in the end-item mix, and not in frequent content variations of option mixes. For this reason, and for all the reasons above, most manufacturers implement this method gradually and use flow manufacturing to make one product family at a time. This necessitates the use of enterprise resource planning (ERP), MRP, or advanced planning and scheduling (APS) for the rest of the business (see Best Manufacturing Scheduling Systems).

While flow manufacturing may have limits in terms of the complexity it can handle, MRP is not without its drawbacks either. MRP is a set of techniques that uses bills of material (BOM) data, inventory data, and the master production schedule (MPS) to calculate requirements for materials so as to make recommendations to release replenishment orders for materials. Because MRP is time-phased, it makes recommendations to reschedule open orders even when due dates and need dates are not in phase. MRP will, by default, create orders with specific due dates for products. Consequently, to manufacture these orders, companies prioritize resources based on these calculated due dates. The unfortunate result is that other orders, perhaps more important orders, are neglected, which often leads to overtime in the factory. Therefore, slack needs to be built into the schedule through conservative, often unjustifiably pessimistic lead times.
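
For readers unfamiliar with the mechanics, the following simplified sketch shows the kind of time-phased netting MRP performs for a single item; the weekly buckets, field names, and the lot-for-lot ordering policy are assumptions for illustration only.

def mrp_netting(gross_requirements, on_hand, scheduled_receipts, lead_time_periods):
    """Net gross requirements against inventory and receipts, and offset
    planned order releases by the fixed lead time (lot-for-lot policy)."""
    periods = len(gross_requirements)
    planned_order_releases = [0] * periods
    available = on_hand
    for t in range(periods):
        available += scheduled_receipts[t]
        net = gross_requirements[t] - available
        if net > 0:
            release_period = max(t - lead_time_periods, 0)
            planned_order_releases[release_period] += net
            available = 0  # the planned receipt exactly covers the shortfall
        else:
            available -= gross_requirements[t]
    return planned_order_releases

# Demand of 40 units per week, 100 units on hand, a scheduled receipt of 50 in week 3,
# and a two-week lead time yield planned order releases of [0, 10, 40, 40, 0, 0].
print(mrp_netting([40, 40, 40, 40, 40, 40], 100, [0, 0, 50, 0, 0, 0], 2))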

Combined with information from actual customer orders, MRP is still the tool most widely used in manufacturing industries to track, monitor, and order the volumes of components needed to make a certain product. However, for the above reasons, many manufacturing environments have discovered that MRP has trouble controlling stock levels, which results in poor delivery performance.

Moreover, MRP is incapable of handling demand-driven, ever-changing manufacturing, since it works well only when demand for a particular product is constant and predictable. If there is any variation in demand, MRP loses many of its advantages, and the benefits of alternative planning approaches grow. In fact, the main flaw of MRP is that it is too deterministic: it does not allow for the natural variation that occurs in real life (e.g., people get sick or go on strike, trucks or shipments get delayed, machines malfunction, quality issues require scrap or rework, and customers do not always, if ever, order according to forecasts). In other words, MRP is a static model of a stochastic reality. Manufacturing requirements change all the time, according to customer orders, available parts, and so on; thus, MRP attempts to apply a high degree of precision to something that is inherently imprecise.

However, Just-in-time, Lean, and Flow Are Not Universal Panaceas

The challenge in using lean and flow manufacturing as a panacea for the shortcomings of MRP often lies in setting the number of kanban cards in the system and the size of each kanban. Even with the help of computerized systems, this can become complex if the demand for each product varies significantly and the production layout is not line- or cell-based.

The just-in-time (JIT) approach normally begins with limiting inventory in the system using a two-bin kanban method. This prevents the shop floor from being flooded with inventory and WIP, and the result is shorter production cycle times and improved inventory control. With JIT, production planning centers its efforts on takt time, so that production volumes are determined by the market rate of pull. Process improvement is achieved by gradually reducing kanban sizes and monitoring the decreasing inventory, but JIT is useful mainly where demand is relatively stable and single-piece flow production is feasible.
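
As a reminder of the arithmetic behind takt time (the figures below are invented), takt time is simply the available production time divided by the customer demand for that period.

def takt_time_minutes(available_minutes_per_day: float, daily_demand_units: float) -> float:
    """Takt time: the pace at which units must be completed to match the rate of customer pull."""
    return available_minutes_per_day / daily_demand_units

# Two shifts minus breaks leave 900 minutes per day; customer pull is 450 units per day,
# so one unit must be completed every 2.0 minutes to stay in step with demand.
print(takt_time_minutes(900, 450))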

In more of a job shop environment, however, the kanban JIT approach no longer makes sense, since the mix of product types, routings, and process times becomes widely divergent, making the prediction of kanban sizes impractical and causing temporary, or wandering, bottlenecks to appear all over the shop floor. Indeed, where the order mix changes, or where not all resources are dedicated to lean flow manufacturing, kanban sizes must continuously be reevaluated. In these situations, a theory of constraints (TOC) approach is often more appropriate.


Manual versus Information Technology Enabled Lean Manufacturing

It is easy enough to grasp the potential benefits of lean manufacturing (see Lean Manufacturing: A Primer, Lean Tools and Practices that Eliminate Manufacturing Waste, and How to Achieve Lean Manufacturing), but selecting the most appropriate lean techniques or tools and the accompanying packaged enterprise software for an individual enterprise has never been that simple. In fact, it is a major exercise for an enterprise to initially identify the most appropriate tools for eliminating different types of waste. For instance, overproduction could be mitigated by improved changeover times and balanced lines, whereas defects and rework could be curbed by improving visual controls, initiating more complete standard operation procedures (SOP) or operation method sheets (OMS), and implementing mistake proofing techniques at the source of error. Furthermore, waste of excessive inventory could be reduced by implementing kanbans and other similar pull systems, while waiting time could be handled by using takt times, and so on.

The trouble is further compounded by the army of software providers (including enterprise resource planning [ERP], supply chain management [SCM], manufacturing execution systems [MES], and product lifecycle management [PLM] providers, as well as best-of-breed, bolt-on lean specialists) that have been hyping their lean capabilities, despite the fact that most of them still support mere nuggets of pseudo-just-in-time (JIT) ways of accommodating mass customization. Providing only support for kanbans, order-less repetitive scheduling, or vendor managed inventory (VMI) or supermarkets, so as to push inventory elsewhere (e.g., onto suppliers) rather than to reduce it across the entire supply chain, is a far cry from true support for lean or demand-driven manufacturing. Where most of these flow manufacturing, lean ERP, or repetitive manufacturing systems fall short is that they have simply automated the most basic of tasks within a lean environment, without addressing larger issues of how to implement lean and pull practices in environments that are not easily amenable to these.

Then again, some people question whether computer systems are even needed for achieving lean manufacturing. After all, some lean tools entail merely physical processes and best practices on the shop floor, where transactional enterprise systems have little to offer. Also, given that computers were not widely available when lean manufacturing and kanbans first emerged, many enterprises have stuck with manually driven lean methods. For such methods, an evolutionary step forward entails the use of custom spreadsheets and reports to support lean functions such as kanban management and heijunka calculations (see Lean and World Class Manufacturing and the Information Technology Dilemma—The Loss of Corporate Consciousness). It is interesting to note, however, that even in such cases, material requirements planning (MRP) systems can still be used to hold core master data on items and bills of material (BOM), though these records have to be tweaked with an eye toward lead time-oriented information.

Some lean purists go even further, believing that lean manufacturing does not mesh well with information technology (IT) systems. For some, the only appropriate technology is the Microsoft Excel spreadsheet. Others claim that the best scheduling method is "no schedule at all", giving the lean enterprise the utmost agility to react to any unpredictable event. At the other extreme, many people have become so accustomed to enterprise systems that they believe we can no longer return to manual procedures (see Run your Business with No Software!).

As usual, the truth might be somewhere in the middle: lean manufacturing and IT are not in opposition, and all good lean systems combine physical systems in the plant with near real-time IT backbones that centralize data, especially where there is automatic data entry and capture. In fact, some people say that the whole point of the lean philosophy is to simplify the physical processes so that one does not need to manage overly complex data systems, though it is still necessary to manage the relevant data at the points where corrections are needed. To that end, many IT systems are designed to bring from the field only the data that management or decision makers can act on.

The reality is that most companies operate in a hybrid, mixed-mode environment where flow or lean and traditional batch or push manufacturing models coexist within the same facility, and where production and demand requirements can change throughout the different stages of a product's life cycle. Manufacturers can produce both high-volume goods with steady demand and low-volume goods with fluctuating demand, and their product mix may include engineer-to-order (ETO), make-to-order (MTO), and make-to-stock (MTS) items.

To successfully operate in this mixed-model environment, one has to take advantage of the strengths of each model and apply them where best suited. Thus, one should use traditional ERP systems for handling long lead-time items, one-of-a-kind production, and products with long production cycles, and for long-term budgeting and planning. On the other hand, lean manufacturing is often more easily applied to manufacturing operations with low-mix, high-volume, make-to-demand products. Moreover, one should not necessarily preclude pull-based execution processes from being implemented in to-order or highly configured operations, where it has also occasionally been done with great success.

Also, as lean spreads beyond the relatively stable manufacturing environment it was originally designed to support, companies realize that IT can play a vital role in streamlining the supply chain (see Moving Beyond Lean Manufacturing to a Lean Supply Chain). Namely, while the lean factory may use kanban pull signals to move product more efficiently through the manufacturing process and out of the door, it is missing the feedback loop from the factory to other functional departments within the organization or to the entire supply chain. That information is primarily transmitted and received via enterprise systems.

So, how can IT support lean manufacturing? For one, while complex packaged enterprise (ERP, SCM, etc.) systems may seem inconsistent with the simplicity of visual control, they actually work well together. Although visual signals, such as kanbans and status indicator lights, are an effective way to trigger factory floor activities and the movement of materials, their inherent weakness is their lack of memory: visual signals cannot be recorded or tracked to determine historical performance, nor can they provide real-time status to anyone who is not within direct view.

Yet, by coupling visual controls with real-time collection of data from the factory floor, manufacturing enterprises should be able to capture the critical information behind the visual control signals for management oversight, planning, and accounting purposes. This information can be used for statistical analysis, to measure historical performance, and to monitor status, all of which are essential elements of the continuous improvement that lean manufacturing emphasizes. Lean-aspiring manufacturers can also use enterprise systems to replace some visual controls, such as physical kanban card signals, with electronic ones, as a way to further improve efficiency and eliminate non-value-adding activities.

Furthermore, these systems can play a critical role in establishing and ensuring standardized work. This is because they can serve as the central repository for critical engineering or product data management (PDM) information for standardized work, including BOMs, process routings or operations, valid product configurations, work instructions or SOPs, engineering change notices (ECN), schedule information, and costs. More robust solutions can even track as-designed, as-built, and historical actual product information, which can be analyzed to determine the impact that product changes have on efficiency and productivity.

How to Achieve Lean Manufacturing

Since it is lean manufacturing's role to deliver value to the customer, the first step for any manufacturer that attempts to make its organization lean is to define value from the perspective of the customer, whether the end customer or an intermediate customer. This value must be identified and expressed in terms of how the specific product meets the customer's need, at a specific price and at a specific time. To do that, one has to be able to evaluate performance in terms of customers, products, profitability analyses, and so on, by measuring well thought-out key performance indicators (KPIs), such as customer sales, product sales, profitability by customer, profitability by product, etc.

Map the Value Stream

As the next step, manufacturers must identify and map those activities that contribute to value and those which do not. The entire sequence of the activities or processes that are involved in creating, producing, and delivering a good or service to the market—from design and sourcing to production and shipment—is called the value stream. For any finished good, the value stream encompasses the raw material supplier, the manufacture and assembly of the good, and the distribution network. For a service, on the other hand, the value stream consists of suppliers, support personnel and technology, the service producer, and the distribution channel. The value stream may be controlled by a single business or by a network of several businesses.

Once the activities have been identified, companies must determine which activities are value-adding, which are non-value-adding but essential to the business (e.g., payroll), and which are non-value-adding and non-essential. The impact that necessary, non-value-adding activities have on the value stream must be minimized, while non-value-adding, non-necessary activities must be eliminated from the process. To that end, value stream mapping (VSM), a logical diagram of every step involved in the material and information flows from order to delivery of a product, can be performed for both the current process and the future process. A visual representation of every step in the process is thereby drawn, and key data, including customer demand rate, quality, and machine reliability, are noted down.
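
As a small, hypothetical illustration of what can be computed from the data captured on such a map (the step names and times below are invented), the value-added ratio compares the time spent in value-adding steps with the total lead time through the stream.

# Hypothetical value stream data: (step, value_adding, minutes).
# Queue and wait time typically dwarfs processing time, which is exactly what VSM exposes.
steps = [
    ("raw material queue", False, 2880),
    ("machining",          True,    12),
    ("wait for assembly",  False,  960),
    ("assembly",           True,    25),
    ("pack and ship",      True,     8),
]

total_time = sum(minutes for _, _, minutes in steps)
value_added = sum(minutes for _, adds_value, minutes in steps if adds_value)
print(f"value-added ratio: {value_added / total_time:.1%} of {total_time} minutes of lead time")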

Because VSM can lead to potential cost reductions, improved throughput, higher asset utilization, and so on, some software vendors now provide business process design, viewing, and publishing tools, and even business process reference models or templates of best-practice processes, to assist with VSM creation. These tools are used to design processes, to communicate them, and to educate people to work according to the decided processes. By modeling the processes, one can visualize what the organization does and how it does it, as well as gain a view of responsibilities. Meanwhile, by connecting tools and documentation to processes, one can visualize which activities are performed and who or what controls them. Solutions might also include functionality for publishing processes, connected applications, and documentation to an intranet-based workplace, which would ease the communication of changes throughout an organization and support employees in working according to the decided processes. Processes could even be published to a Web site, making it far easier for employees to access suggested or decided processes.

Once manufacturers identify value-adding and non-value-adding but necessary activities, they should then work to make these activities flow as an uninterrupted movement of products or services through the value chain to the customer. This requires manufacturers to eliminate functional barriers and to develop a product-focused organization. Dramatically reducing lead and cycle times, and eliminating work in queue, batch processing, waiting, scrap, and unnecessary transportation in this way, should lead, in the best-case scenario, to single piece flow.

Several tools can help organizations to achieve flow manufacturing, including total productive maintenance (TPM), leveled schedules (heijunka), long-term purchase agreements with suppliers, just-in-time (JIT) call-offs (possibly via electronic kanbans), and supplier managed inventory (SMI) for raw materials. In addition, though it may be surprising to some, forecasting can also be important in lean manufacturing, as it provides the basis for long-term purchase agreements with suppliers and also determines long-term capacity requirements. Forecasting also helps with generating a leveled master production schedule (MPS) based on the flow of materials through the supply chain or factory and costs. This is particularly useful where there is variable demand or new product introductions (NPI), since there may always be one major capacity constraint or minor ones may pop up here and there.

In fact, while forecasting might have had a poor reputation in manufacturing circles (particularly among those firms attempting lean), recently there has been an increased awareness that with good collaborative planning and forecasting software that supports collaborative sales and operations planning (S&OP) processes, many manufacturers could improve their business performance (see Sales and Operations Planning). Thus, as with production planning, manufacturers need to remain on top of forecasting by leveraging much shorter review intervals than the traditional quarterly (if not yearly) updates. By taking forecasting more seriously and supporting it with smart, interactive tools, all parties within the manufacturing business should be on the same page at the end of the day, which should result in increased agility. The exception, of course, is manufacturers in volatile markets or with products with short lifecycles, for whom forecasting based on history often means missing the true demand signals from customers or distribution channels.

Flow manufacturing does not address synchronization across the supply chain, with its multiple partners and suppliers, since it is merely a shop-floor execution tool. If only for this reason, enterprises should still use supply chain planning (SCP) for strategic purposes in which multiple departments (sales and operations, inventory, distribution, collaborative demand management, transportation planning, etc.) are involved, such as planning for resources across an organization, preparing for promotions, negotiating long-term contracts, establishing objectives, and coordinating multi-site operations.

After manufacturers remove waste and establish a seamless flow, they must transform into demand-driven organizations where customer demand pulls products through the value stream, driving manufacturing activity and material flow. The ultimate goal is to become so responsive that products are delivered only when the customer, internal or external, needs them (i.e., places or signals the actual order), not before and not after, while delaying the use of material and labor as long as possible.

To that end, intuitive and visible pull signals should initiate manufacturing and the movement of material. This might necessitate support for various types of pull signals or kanbans (e.g., printed cards, or electronic signals where computers are used). Production can consequently be order-less, with backflushing to automatically report the material consumption of components. This should result in major reductions in work-in-process (WIP) and production cycle times. The number of cards to print and use for each item and receiving location can either be entered manually or calculated and updated automatically in an enterprise system.
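
One widely cited rule of thumb for such a calculation (a textbook formula, not any particular vendor's algorithm) sizes the number of cards from the average demand during the replenishment lead time, plus a safety allowance, divided by the standard container quantity.

import math

def kanban_card_count(daily_demand: float, lead_time_days: float,
                      safety_factor: float, container_qty: int) -> int:
    """Demand during the replenishment lead time, plus a safety allowance,
    divided by the standard container quantity, rounded up to whole cards."""
    return math.ceil(daily_demand * lead_time_days * (1 + safety_factor) / container_qty)

# 200 units per day, a 1.5-day replenishment lead time, 10 percent safety, and
# 25 units per container give 14 cards circulating for this item and location.
print(kanban_card_count(200, 1.5, 0.10, 25))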

Kanban cards carry somewhat different information depending on their type. Production cards, which signal that a container should be filled with new parts, carry the item number, item description, requesting location, packaging type, container quantity, and replenishment lead time. Transportation cards, which signal that a container should be moved, whether empty or full, carry the item number, item description, sending location, receiving location, total number of kanban cards, and standard container quantity. In the so-called one-card type of kanban system, only production cards or only transportation cards are used, while in the two-card type of system both are used.
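
A minimal sketch of how an enterprise system might represent the two card types; the field names are assumptions derived from the lists above.

from dataclasses import dataclass

@dataclass
class ProductionKanban:
    """Signals that the container should be filled with new parts."""
    item_number: str
    item_description: str
    requesting_location: str
    packaging_type: str
    container_quantity: int
    replenishment_lead_time_hours: float

@dataclass
class TransportationKanban:
    """Signals that the container should be moved, whether empty or full."""
    item_number: str
    item_description: str
    sending_location: str
    receiving_location: str
    total_kanban_cards: int
    standard_container_quantity: int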

Taking it a step further, one could leverage product technology that fosters collaborative agreements with suppliers and trading partners, so that signals are sent and the requisite data is collected as the actual products flow through the supply chain. This would increase transparency and help avoid bullwhip effects on stock across the entire supply chain. A good example would be a repetitive scheduling technique like the customer delivery schedule (CDS), which supports repetitive demands for one or several items to one or several locations in one transaction, for discrete times or time buckets.

Such schedule level management tools can enable users to receive demands with different levels of accuracy or validity from different partners within the customer organization, while managing different types of demands, such as consolidated customer forecasts, customer forecasts, customer call-offs and JIT call-offs, customer sales reports, and sales statistics. The tool can be either integrated with various electronic data interchange (EDI) components that support several types of EDI message standards and message types, or managed manually. Use of vendor managed inventory (VMI) or point-of-sale (POS) data on the customer side should have a similar effect.

Software vendors also offer a number of other tools to help organizations respond to customer demand. For instance, a supermarket is a tightly managed amount of inventory within the value stream that allows for a pull system. Such inventory buffers can contain either finished items or WIP. They are used to handle finished goods inventories that are replenished by a continuous flow pacemaker process, which falls somewhere between a continuous flow process and other manufacturing processes shared by other value streams, as well as for incoming parts and material being pulled from supplier locations.

Some vendors offer strong JIT call-off management functionality as a planning and execution environment for both proactive and reactive sequence deliveries, as well as for frequent electronic kanban deliveries. Using this functionality, final products are broken down into specific items, a specific end product is pinpointed and identified via a sequence number or a production identification (ID), and the sequence call-off is sent out when the final production sequence is frozen at the customer line (which may be either hours or days before the parts are needed, depending on the application).

For advanced flow-oriented planning and execution environments, support for supply in line sequence (SILS) is available. SILS promotes the use of configuration within sequence flows, which allows, for instance, the ordering of a configuration of cables for a single vehicle on the customer's production line. To pack according to all the different packing demands found within the SILS concept, a very flexible set of rules must support manual or automatic packing of call-off demands, according to either a variant of the SILS concept or the customer's requirements.