WHITE PAPER

Data Integration: Increasing its Success and ROI

sponsored by expressor software www.expressor-software.com

Rick Sherman, Athena IT Solutions


Table of Contents

• Business Demand for Data Integration
  • The State of Data Integration Today from a Business Perspective
  • Business Fills the Information Gap
  • The State of Data Integration Today from IT's Perspective
• The Roadmap to Success and Business ROI
  • Business and IT Alignment
  • Data Governance
  • Holistic Approach
  • Incremental and Iterative Approaches
  • Productive, Efficient Methods
  • Tool-based Development
• Selecting a Data Integration Tool
  • Best Fit
  • Supports a Data Integration Program, not just Tactical Projects
  • Data Governance
  • Scalable, Expandable and Pervasive
  • Role-Based Development
  • Cost- and Resource-Effective
  • Business ROI
• Conclusions



Business Demand for Data Integration

Businesses crave information. They need it to grow and operate the business, make it more efficient and respond to customers. And businesses are investing in getting that information. IDC estimates that the total business intelligence (BI) software market was just over $7 billion in 2007 [1]. A just-completed Gartner survey of over 1,500 CIOs worldwide lists BI as their top priority for 2009 [2], even as budgets are constrained. In fact, BI has been one of the top priorities for the last few years.

Data integration is the driving force that enables BI. According to Gartner Research [3], data integration grew to be a $1.4 billion software market in 2007, and Gartner estimates it will grow at an annual rate of 17 percent through 2012. The top business areas that data integration supports include:

• Performance management – measure and better manage business processes, such as supply chains or budgeting/planning, leading to efficiencies and improved profits.
• Integration of customer data – improve customer experiences, identify up-sell and cross-sell opportunities, determine customer profitability and help prospect for new customers.
• Cost cutting – identify expenditures and cost-cutting opportunities.
• Regulatory compliance – support adherence to governmental and industry regulations such as Sarbanes-Oxley and the Health Insurance Portability and Accountability Act (HIPAA).

The State of Data Integration Today from a Business Perspective

Despite the investments in business intelligence and data integration, many companies are data rich but information poor. A survey by Accenture of 1,000 managers [4] found that 59 percent said that, because of poor information distribution, they miss information that might be valuable to their jobs almost every day; 42 percent accidentally use the wrong information at least once a week; and 53 percent said less than half of the information they receive is valuable. In a BusinessWeek survey [5], 55 percent of executives and managers said it was difficult to very difficult to get relevant information to make business decisions. A later Accenture survey [6] found that the situation had not improved: 40 percent of executives' decisions were based on "gut" rather than analytics, and over 60 percent of those gut decisions were made because good data was not available.

Business people perceive that their data is inconsistent, incomplete and not current enough. The surveys state that business people spend significant amounts of time looking for and gathering data. They feel that the data is scattered and fragmented across the enterprise in data silos that are not integrated. Many companies have multiple data warehouses and dueling data marts in addition to various enterprise applications. An Aberdeen Group research study in fall 2008 titled "One Version of the Truth 2.0: Are Your Decisions Based on Reality?" [7] reinforced this perception, finding that 53 percent of respondents found it difficult to access and integrate data.

[1] IDC, "Worldwide Business Analytics Software 2008-2012 Forecast and 2007 Vendor Shares", November 2008.
[2] Gartner Research, "Gartner EXP Worldwide Survey of More than 1,500 CIOs Shows IT Spending to Be Flat in 2009", January 14, 2009.
[3] Gartner Research, "Magic Quadrant for Data Integration Tools", September 22, 2008.
[4] Accenture, "Managers Say the Majority of Information Obtained for Their Work Is Useless, Accenture Survey Finds", January 4, 2007.
[5] BusinessWeek Research Services, survey of 675 US and European executives and managers.
[6] Accenture, "Survey Shows Business Analytics Priorities Not Yet Achieved", July 2008.

Business Fills the Information Gap

Business people are frustrated with the time and cost of integrating data. But they cannot wait for the perfect situation to occur, so they take matters into their own hands. As mentioned above, in the absence of easily accessible, relevant data, business people will gather the data on their own and make a best guess. This has resulted in ill-informed decisions, through no fault of the business person making the decision. It has also resulted in dueling numbers in meetings and discussions, where different business people come up with different numbers on the same subject. Maybe they used two different IT applications, i.e. data silos that were not consistent; created their own spreadmart; or simply "cut" data from IT applications and "pasted" it into their own spreadsheets. Regardless, they waste time and may make ill-informed decisions because of the poor state of data integration.

An even costlier result of the lack of data integration has been the rise of data shadow systems, or spreadmarts. These are most often collections of Microsoft Excel spreadsheets and Microsoft Access databases used to support business processes and perform reporting and analysis. These data shadow systems can include dozens, if not hundreds, of separate files to gather, transform, load and report on data. In a report I coauthored for TDWI, published in Q1 2008 [8], our survey revealed that the average enterprise had a median of 30 spreadmarts actively used by the business. I have encountered many enterprises where tens to hundreds of business people were using a particular data shadow system for their primary reporting and analytics, while only a small fraction of those people were accessing the enterprise data warehouse or their business group's data mart. Often that situation was multiplied, with various data shadow systems spread out across the enterprise supporting different business groups and processes.

The TDWI report conservatively estimated that the care and feeding of each data shadow system costs $780,000 annually. That does not include the opportunity cost of business people's time being diverted from making decisions to gathering data, nor does it include the cost of data errors or faulty decisions made on inaccurate data. Business groups do not want to be in the business of creating and maintaining their own data shadow systems, but business continues 24x7 whether IT is providing integrated data or not. Sometimes they just have to do the best they can with the data that is available (integrated or not).

The State of Data Integration Today from IT's Perspective

Data integration projects are too costly, time consuming and complex. Those characteristics would be tolerable if the business were truly getting integrated information and there were a positive ROI, but as we have discussed, that is not often the case. What keeps IT from being successful in data integration?

First, IT is faced with the "Goldilocks and the Three Bears" syndrome: data integration is always too hot or too cold, and never just right.

[7] Aberdeen Group, "One Version of the Truth 2.0: Are Your Decisions Based on Reality?", fall 2008, http://www.aberdeen.com/summary/report/benchmark/5298-RA-decisions-based-reality.asp
[8] Rick Sherman and Wayne Eckerson, "Strategies for Managing Spreadmarts: Migrating to a Managed BI Environment", TDWI Report, Q1 2008.


It is too hot because the "top tier" products are expensive, complex and require dedicated, highly skilled resources. The cost and resource commitments have constrained their use to creating and maintaining enterprise data warehouses (EDWs) at large corporations. Even those large corporations that could afford the top-end tools have found that getting throughput and performance out of them requires significant investments: purchasing hardware and memory, along with the expertise to tune the servers, storage, networks and databases. Corporations often spend hundreds of thousands of dollars on software and infrastructure when using these top-tier tools. Annual maintenance fees and upgrades, beyond labor costs, create a heavy ongoing expense.

It is too cold because most of the time -- with estimates from industry analysts of up to 70 percent -- data integration is done with hand-coding. Even at large corporations, data marts or cubes are generally hand-coded. This happens even when the EDW is built with packaged tools, because there is no budget or resources available for departmental projects, or simply because the tool is overkill for the data integration tasks at hand. Smaller enterprises have historically done all their data integration by hand-coding because of the costs and resources involved. Until recently, IT groups have not found the "just right" tool that matches sophistication, cost and resource requirements to the intended data integration.

Second, because IT has taken on data integration projects in a tactical manner, there has been overlap and redundant effort, which leads to data silos and a much higher cost than necessary. Rather than creating a data integration program where each project leverages the others' work, these projects are stovepipes.

Third, IT has too narrowly viewed data integration as simply ETL. Building data warehouses with batch-driven ETL has defined, from an IT perspective, what data integration can do, yet many data integration vendors support real-time updates using techniques such as messaging, EII (enterprise information integration) and services via SOA (service-oriented architecture). With the limited viewpoint that DI is only ETL, many integration projects were done by hand-coding or by introducing narrowly focused tools rather than a full-featured data integration tool. This has driven up the cost of integration, strained resources and created yet more data silos.

It is unfortunate, but this too-hot-or-too-cold use of data integration tools has significantly inhibited data integration from becoming pervasive in enterprises big and small, and has kept it from keeping up with the information demands of the business.

The Roadmap to Success and Business ROI

Business demand for information is ever increasing. Businesses are willing to invest in data integration to get the information they need to run their operations, get more customers, improve the customer experience, increase profitability, decrease costs, and comply with government regulations along with privacy and security requirements. But those investments have not yielded the information nirvana that has been promised. To meet their needs and expectations for information delivery, businesses need a new approach. Business and IT have to establish an integration roadmap involving much more than tools: people, processes and procedures as well. It is only through this more comprehensive approach that companies will achieve successful integration with a business return on investment (ROI). The roadmap establishes the foundation for successful integration:


• Business and IT alignment
• Data governance
• Holistic approach
• Incremental and iterative approaches
• Productive, efficient methods
• Tool-based development

Business and IT Alignment

IT and business groups need to align their integration efforts to support current and future business information needs. Almost all business initiatives require information, so integration projects need to be fundamental components of these initiatives right from the start.

The first step is for IT to understand the information needs of the strategic business initiatives. Too often, the first time business needs are addressed is in gathering the requirements for a specific integration project. This leads to the "not seeing the forest for the trees" syndrome: IT is so wrapped up in the details of a tactical project that it loses sight of the overall information needs -- and sometimes never sees them to begin with. Without understanding overarching information needs, IT cannot break the data silo habit that has characterized so many enterprise integration efforts.

The second step is the most obvious: IT obtaining sponsorship and funding for the integration programs. Although this sponsorship is crucial, too often IT assumes that because the business has written the check, it is aligned with what IT is doing. This far too often results in unmet business expectations, because business and IT did not continue to work together and there is a disconnect between what is needed and what is built.

The next step is for IT to maintain a tight working relationship with the business throughout its projects. IT needs to view the business as both customer and partner. It is only through active participation in the project that IT and the business can really align their interests. This participation is not just meeting to gather requirements, but proactive involvement in intermediate results, feedback and testing throughout the project. IT needs to avoid the scenario where, after gathering requirements, it works on integration in isolation from the business, emerging only when the project is "complete." At that point it becomes obvious that the business should have been actively involved all along to answer questions and provide feedback.

Data Governance

One of the biggest roadblocks for businesses trying to use information is data that is inconsistent, undefined, or from an unknown source. Debates about different numbers in different reports, about what a value really means, or simply about whether data is available waste business people's time and thwart effective decision-making. To paraphrase an old adage: "If your data is not part of the solution, your data is part of the problem."

Business people should be responsible for establishing what the data means and defining the transformations needed for reporting and analytics. Too often IT takes on this responsibility without the business taking ownership. Data is a corporate asset, not an IT asset, and as such the business has to assume ownership. The business often does not get involved either because IT has chosen to do it itself, or because IT has not adequately set up the processes and established the working relationship for the business to get involved.


This "data governance" is not just a one-time task; it is an ongoing process of adding or modifying data definitions and transformations to match changes in the business. The business owns the definitions, while IT is responsible for implementing those definitions and transformations in its data integration and business intelligence efforts. IT needs to establish expectations and processes for ongoing data governance. With government and industry regulations, business people should be eager to get involved, but it is up to IT to create the framework for this to occur.
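A minimal sketch of this split in responsibilities might capture the business-owned definitions as data that an IT-owned engine applies. The metric names and rules below are hypothetical illustrations, not from any particular tool:

```python
# Sketch: business-owned metric definitions captured as data, not buried in code.
# All names and rules here are hypothetical illustrations.

# The business owns these entries (what "net_revenue" means); IT owns the
# engine that applies them, so a definition change never requires a code change.
METRIC_DEFINITIONS = {
    "net_revenue": {
        "owner": "Finance",
        "definition": "Gross sales minus returns and discounts",
        "transform": lambda row: row["gross_sales"] - row["returns"] - row["discounts"],
    },
    "active_customer": {
        "owner": "Marketing",
        "definition": "Customer with at least one order in the period",
        "transform": lambda row: row["orders_in_period"] >= 1,
    },
}

def apply_metrics(row: dict) -> dict:
    """IT-owned engine: derive every governed metric for one source record."""
    return {name: spec["transform"](row) for name, spec in METRIC_DEFINITIONS.items()}

sample = {"gross_sales": 1200.0, "returns": 150.0, "discounts": 50.0, "orders_in_period": 3}
print(apply_metrics(sample))  # {'net_revenue': 1000.0, 'active_customer': True}
```

The point of the pattern is that when Finance changes what "net revenue" means, the change lands in the governed definitions, not in scattered hand-coded jobs.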

Holistic Approach

Data integration and business intelligence projects are generally undertaken as tactical projects independent of other projects. Creating discrete projects that can be scoped and completed is a proven project management approach. However, the danger is that discrete becomes disjointed. Separate projects can either perform redundant tasks or, even worse, contradictory work, because each project looks only at its own needs and builds accordingly.

Years ago, telecommunication and e-mail systems were often purchased and deployed by multiple groups within an enterprise. At some point enterprises realized these systems were part of an enterprise backbone that needed to be looked at as a whole and then deployed to support both enterprise-wide and group-specific needs. Without this holistic view, enterprises generally paid too much in cost, time and resources, deploying overlapping and sometimes incompatible systems.

Likewise, data integration needs to be architected in a holistic manner, enabling an enterprise-wide solution that supports specific business groups' needs without costly overlaps and inconsistencies. Data integration should be architected top-down -- designing the supporting information, data, technology and product components -- while being implemented bottom-up, i.e. project by project.

Incremental and Iterative Approaches

Although it is imperative to take a holistic approach in designing an overall architecture, the potential trap is viewing that architecture as the end goal. The real goal is to transform data into business information, but sometimes technically oriented folks lose sight of that. The architecture is a means to an end, not the end itself. Building the architecture blueprint and implementing an enterprise data integration solution should be both incremental and iterative.

Incremental: Don't "boil the ocean." Most enterprises do not have the time, money or resources to build out an enterprise integration solution in one grandiose project. Instead, incrementally building out the architecture one project at a time is a more manageable and practical way to achieve it. The individual projects should be viewed as incrementally building the enterprise data integration portfolio. With an overall architecture, each tactical project can be designed to fill in more pieces of that portfolio in an orchestrated fashion rather than just hoping it happens.

Iterative: Building out the architecture iteratively offers the opportunity for discovery and learning through individual projects. Neither the business nor IT begins with complete knowledge of current or future information needs. The architecture and the implemented solutions need to evolve iteratively to incorporate discoveries and learnings, along with adapting to changing business needs. Thinking that you know everything, or being inflexible, are traps that will sink your data integration investments.


Productive, Efficient Methods

Data integration is most successful when you use productive, efficient methods. Three of these methods are reuse, documentation and auditability.

Reuse: Reusing data definitions and transformations is not only the most productive way to build an enterprise-wide solution, but also the surest way to ensure consistency. Why reinvent the wheel with every integration project when you can leverage past work and ensure consistency? This seems like a no-brainer, but it works only if the definitions and transformations from past projects are readily available.

Document: Documentation enables the identification of data that has been defined before and is a candidate for reuse. This documentation, whether stored in your data integration tool's metadata or simply in a Microsoft Office document, needs to be created and maintained for all data integration projects. Each project needs to be able to leverage other projects' learnings, and should make its own sharable as well.

Auditable: With the rise of government regulations and industry standards such as Sarbanes-Oxley and HIPAA, being able to audit your data integration is not just a "nice to have" but a business necessity.
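The three methods above can be sketched in code. The registry, field names and mapping below are hypothetical; DI tools provide richer, built-in versions of the same ideas:

```python
# Sketch of reuse, documentation and auditability applied to a hand-rolled
# transformation. The registry and state mapping are hypothetical examples.
import datetime

TRANSFORM_REGISTRY = {}   # shared catalog -> later projects reuse, not reinvent
AUDIT_LOG = []            # what/when trail for every transformation run

def register(name, description):
    """Register a transformation with its documentation for discovery and reuse."""
    def wrap(fn):
        TRANSFORM_REGISTRY[name] = {"fn": fn, "doc": description}
        return fn
    return wrap

@register("standardize_state", "Map free-text US state values to 2-letter codes")
def standardize_state(value: str) -> str:
    codes = {"massachusetts": "MA", "ma": "MA", "new york": "NY", "ny": "NY"}
    return codes.get(value.strip().lower(), "UNKNOWN")

def run_transform(name, value):
    """Apply a registered transformation and record an audit entry."""
    result = TRANSFORM_REGISTRY[name]["fn"](value)
    AUDIT_LOG.append({
        "transform": name,
        "input": value,
        "output": result,
        "at": datetime.datetime.now().isoformat(),
    })
    return result

print(run_transform("standardize_state", " Massachusetts "))  # MA
```

A second project that needs state codes looks the rule up in the registry instead of rewriting it, and every application of the rule leaves an audit trail.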

Tool-based Development

Historically, the cost, complexity and skills needed to use data integration (DI) or ETL tools have meant that custom (or hand) coding became the de facto method for developing data integration processes. This has been a tale of the rich versus the rest. Large corporations with significant budgetary commitments have adopted tool-based development as a standard, while everyone else, including other projects in the same companies, has opted for hand-coding. The benefits of tool-based development versus custom coding include:

• Productivity gains from pre-built functions or transformations supplied by ETL or DI tools. ETL tools provide many of the functions and transformations that are used repeatedly in data warehousing, business intelligence and data migration applications. Hand-coding not only requires these to be built from scratch (delaying the project and potentially introducing errors), but the hand-coders may not even be aware of best practices for those functions.



• Workflow and error-recovery functionality, enabling robust, production-ready data integration. ETL tools generally track operational processing, enabling not only recovery and status notifications but also the ability to analyze performance to identify bottlenecks and interactions with other applications. With hand-coding, all of this has to be coded in addition to the basic data integration processing, incurring a heavy cost in time and resources.



• Impact analysis and where-used functionality. This enables change management, improves productivity and reduces surprise errors. With hand-coded applications it is often a guessing game, as developers must search through pages of code and hope they have even found all the code being used.



• On-demand, current documentation. The documentation (if it is even created) associated with hand-coding is generally incomplete and out-of-date. Documentation provides auditing and fosters understanding of the data for both the business person and the IT developer.



• Repositories and semantic dictionaries that enable the reuse and easy maintenance of business and technical definitions, often referred to as metadata. Although this functionality may be "hidden under the covers," it provides the significant benefit of helping each subsequent data integration project leverage existing work, speeding up development (time-to-market) and reducing business errors.

The fact that projects with sufficient resources, i.e. budget and skills, almost always use ETL tools makes their use compelling -- if you have the money. The good news for everyone who felt they could only afford a custom-coded solution is that there are now ETL and data integration products whose cost can be justified for most projects.
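To make the error-recovery point above concrete, here is a sketch of one small piece of the plumbing a DI tool gives you "for free" and which hand-coded pipelines must rebuild: checkpointing, so a failed run restarts where it left off. The file name and step names are hypothetical:

```python
# Checkpointed batch pipeline: a failed run resumes at the first incomplete
# step instead of starting over. File and step names are hypothetical.
import json
import os

CHECKPOINT = "pipeline_checkpoint.json"

def load_checkpoint():
    """Return the index of the last completed step, or -1 on a fresh run."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["last_completed"]
    return -1

def save_checkpoint(step_index):
    with open(CHECKPOINT, "w") as f:
        json.dump({"last_completed": step_index}, f)

def run_pipeline(steps):
    """Run each step once, skipping any already completed in a prior run."""
    last_done = load_checkpoint()
    for i, (name, fn) in enumerate(steps):
        if i <= last_done:
            print(f"skipping {name} (already completed)")
            continue
        fn()                   # a real step would extract/transform/load here
        save_checkpoint(i)
        print(f"completed {name}")

steps = [
    ("extract_orders", lambda: None),
    ("transform_orders", lambda: None),
    ("load_orders", lambda: None),
]
run_pipeline(steps)
```

Multiply this by status notification, operational statistics, scheduling and dependency handling, and the "heavy cost in time and resources" of hand-coding becomes easy to see.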

Selecting a Data Integration Tool

IT groups create their shortlist of ETL/DI vendors based either on the current market leaders or on industry analyst groups' top-ranked vendors. But no shortlist comes into play with custom coding, which is the predominant method used for data integration. It appears there is a major disconnect between the current shortlisted vendors and the needs of many enterprises. The criteria to use when selecting a data integration tool include:

• Best fit
• Supports programs, not just projects
• Data governance
• Scalable, expandable and pervasive
• Role-based development
• Cost- and resource-effective
• Business ROI

Best Fit

If we used the same logic in selecting and purchasing a calculator as IT groups do for ETL tools, all of us would walk out of the office supply store with a scientific graphing calculator. Why? Because those calculators are the most feature-laden, and therefore the "best." But most of us would not know how to use most of that functionality, and we should not be forced to buy something we will not use. Almost all of us purchase a simple calculator that can do basic calculations like balancing a checkbook or computing taxes or tips. We buy the best fit for us, not the "best," i.e. most feature-laden, calculator.

The "best" ETL products are great, but they are generally expensive, complex and require a greater investment (time, money and skills) than other products that may also do everything you want, at a potentially significantly lower total cost of ownership (TCO). When an enterprise selects a data integration tool, it has to determine what it needs, how much it is willing to spend and what skills it has to implement the software. Getting a product that is too complex for your staff or the tasks at hand means the product will likely become shelfware. Match the tool to your integration needs, skills and budget, not the other way around.


A good test of best fit is to run a proof-of-concept (POC) or prototype with your selected DI/ETL tool using your own data.
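A POC of this kind boils down to running representative transformations over a sample of your own data and checking counts, rejects and timing. A minimal hand-rolled harness might look like the following; the file layout and data-quality rule are hypothetical:

```python
# Minimal POC harness: run a transformation over sample data and verify that
# every input row is accounted for, within a time budget. Layout is hypothetical.
import csv
import io
import time

sample_csv = io.StringIO(
    "customer_id,region,amount\n"
    "1,ma,100.50\n"
    "2,ny,200.00\n"
    "3,,50.25\n"
)

start = time.perf_counter()
rows_in, rows_out, rejects = 0, 0, []
for row in csv.DictReader(sample_csv):
    rows_in += 1
    if not row["region"]:            # a simple data-quality rule for the POC
        rejects.append(row)
        continue
    row["region"] = row["region"].upper()
    rows_out += 1
elapsed = time.perf_counter() - start

# POC exit criteria: full row accounting, plus a timing number to compare
# against the candidate tool running the same sample.
assert rows_in == rows_out + len(rejects)
print(f"{rows_in} read, {rows_out} loaded, {len(rejects)} rejected in {elapsed:.4f}s")
```

Running the same sample and rules through each candidate tool gives you a like-for-like comparison of effort, throughput and fit before you buy.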

Supports a Data Integration Program, not just Tactical Projects

An ETL tool that supports multiple projects and developers spread out over a data integration program delivers significant business and IT benefits. This capability facilitates the reuse of data definitions and transformations among projects and collaboration between developers. From a business perspective, this also improves data consistency and reduces the likelihood of creating data silos. From an IT perspective, reuse and collaboration boost productivity and make it easier to bring additional people onto each project, because knowledge is not locked into a programmer's "proprietary" custom code. Data integration tools foster sustainable software development and maintenance practices such as source control, revision management and functional sharing. Given the importance of data and its integration to an enterprise, it is often too risky to rely on manually creating, documenting and managing custom code -- not to mention more costly, time consuming and error-prone.

Data Governance

As already discussed, a successful data governance program obtains business ownership of the data definitions and transformations, which is vital to implementing consistent information throughout an enterprise. It is important to have a data integration tool that can directly support those definitions and transformations, so that governance does not become a paper exercise never implemented in the integration processes. In addition, role-based support (discussed below) for various data governance activities will improve responsiveness to business needs and to changes in the business.

Scalable, Expandable and Pervasive

Enterprises, from the Fortune 1000 to small and medium businesses (SMBs), typically need to integrate many varied data sources. These sources are spread across many platforms; stored in structured and semi-structured data types; have different update cycles; are interconnected and co-dependent; and need batch or real-time processing. In short, most enterprise data integration needs are diversified and challenging. The data integration tool you select must be able to handle this diversity.

In addition to supporting varied data sources, it is highly desirable for a data integration tool to be able to optimize data processing, particularly with parallel capabilities. This functionality gets the most out of whatever infrastructure footprint is used for data integration by maximizing data throughput. Such optimization saves money by reducing the infrastructure investments that would otherwise be required, and helps those committed to greener IT. It also shortens integration processing time, which in today's business climate is always in demand.

Historically, integration efforts have been siloed across an enterprise because of the varied requirements discussed above. Because ETL products often supported only batch-driven processes, primarily accessing relational databases and flat files, their focus was generally data warehousing and business intelligence. With these diverse integration needs, select a data integration product that goes beyond DW or BI and enables data migration, data synchronization, service-oriented architecture (SOA), data quality, enterprise data access, complex data exchange and master data management (MDM). An enterprise's initial integration efforts may be focused on DW and BI, but many of these other areas will arise in the future, and it will be desirable to simply expand your data integration platform rather than build a whole new, and likely overlapping, integration platform. To make sure your data integration tool is widely used in the enterprise, choose one that supports diverse sources, data processing models and applications.

A word of caution: watch out for data integration tools that have been built through acquisitions and bolted-on products. These tools become overly complex and difficult to use.

Role-Based Development

There is a natural tendency to equate ETL tools with developers, but that is shortsighted. Look for a data integration tool that supports the entire data integration lifecycle, from design through development, testing, production and maintenance. Besides the developer, the tool's lifecycle support should include the project manager, architects, analysts, data stewards, testers, production support staff and even the business person using the information. The data integration tool needs to enable processes that facilitate each of these roles in your integration program. By narrowly framing ETL as a coding activity, a program loses sight of the continuous role of the business and IT in defining data and transformations. Each of the roles, and the tasks they perform, should be automated and incorporated into the tool's processes. This increases productivity, reduces the time to deploy new integration projects, encourages reuse and increases data consistency.

Cost- and Resource-Effective

ETL tool-based development has historically been viewed as too expensive and hard to learn, as well as taking too much time on each project to develop the integration processes. The market "leaders" started as batch-only tools for DW development. Expanding their functionality meant incorporating software from acquisitions or altering existing architectures while trying not to disrupt current customer implementations. As a result, customers deployed these complex products for uses that didn't require the complexity, and they were still paying mainframe-level prices, a big blow to their ROI.

The good news is that times have changed. Newer data integration tools have been architected from the ground up to support a diversity of data sources and integration styles. This means a development team should become productive sooner and be able to reuse these tools in various situations. In addition, newer pricing models bring these tools within most enterprises' reach. In many cases, the most expensive product is not the best choice; many enterprises have "blown their budget" on the ETL tool and then had no budget left to finish the project. Make sure the product you select is affordable and provides the functionality your enterprise needs.

Business ROI

No matter how successful your integration efforts appear, it is important to determine the business ROI that your data integration efforts have achieved. The business benefits will be both quantitative and qualitative.


Qualitative benefits abound from data integration projects. Being able to access consistent, comprehensive, clean and current information reduces guesswork, enables better analysis and improves decision-making. Business productivity improves by shifting employees' time from gathering data to analyzing and acting on it. IT productivity improves by putting the right tools in place to create the integration processes. Data integration also helps the business meet various government and industry regulations. All of these benefits are enabled by an enterprise's data integration program.

Quantifying the business benefits is sometimes harder. Data integration enables improved decision-making, which leads to increased sales, improved customer service, increased profitability or reduced costs, but other processes are involved as well. Data may be the key ingredient of performance management systems, but there also have to be business processes in place to leverage that knowledge and, in turn, produce the business benefit. The trick is not to claim that data integration alone created a business benefit, but that the data-enabled solution did. With this mindset, one should be able to quantify the benefits from the various data-enabled solutions deployed in the enterprise.
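To make the quantification point concrete, ROI is conventionally computed as (benefit − cost) / cost. A minimal sketch, with the caveat that the dollar figures below are invented for illustration and the benefit should be attributed to the data-enabled solution as a whole, as the text argues:

```python
def data_integration_roi(annual_benefit, annual_cost):
    """Return ROI as a percentage: (benefit - cost) / cost * 100."""
    return (annual_benefit - annual_cost) / annual_cost * 100

# Hypothetical figures: a data-enabled solution credited with $500k of
# annual benefit against a $200k annual total cost of ownership.
print(data_integration_roi(500_000, 200_000))  # → 150.0
```

Even a simple model like this forces the enterprise to name the cost and benefit assumptions behind each data-enabled solution, which is the real discipline the section above calls for.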

Conclusions

There has been an ever-increasing demand for information in enterprises of all sizes across industries. Despite significant investments in data integration and business intelligence, this demand has not been met. Business people resort to data silos and data shadow systems to get their information, losing time gathering data, debating the numbers and guessing which numbers are right.

Enterprises need to take a new approach to information and data integration. First, business and IT people need to work jointly to take ownership of, and responsibility for, data and its integration. Second, they need to establish processes to build a data integration portfolio within a holistic information architecture. Finally, IT needs to rethink its approach to data integration by selecting the best-fit technology with an affordable TCO. With this new paradigm, data integration efforts will be successful and return a solid business ROI.



About the Author

Rick Sherman has more than 20 years of business intelligence and data warehousing experience, having worked on more than 50 implementations as a director/practice leader at PricewaterhouseCoopers and while managing his own firm. He is the founder of Athena IT Solutions, a Boston-area consulting firm that provides data warehouse and business intelligence consulting, training and vendor services. Sherman has published over 50 articles, is an industry speaker and a DM Review World Class Solution Awards judge, serves as a data management expert at searchdatamanagement.com, and has been quoted in CFO and BusinessWeek. He blogs on performance management, data warehousing and business intelligence topics at The Data Doghouse. You can reach him at [email protected].

About the Sponsor

expressor software tackles the complexity and cost of enterprise IT projects with data integration software that delivers breakthrough development productivity and data processing performance at a significant price/performance advantage. expressor's patent-pending semantic data integration system™ is based on common business terms to enable collaborative, role-based team development, business rule reuse and end-to-end project lifecycle management that cut total data integration costs in half. expressor software is attracting Global 2000 and mid-sized customers around the world that are looking for a smarter, faster and more affordable data integration solution.

expressor software has also developed close business relationships with complementary technology partners, independent software vendors (ISVs), original equipment manufacturers (OEMs), system integrators (SIs) and value-added resellers (VARs). The company's technology partners include Netezza, HP and Intel. Emunio Consulting, CME/emergent-I, Axis Group, G7 and BAAX are among expressor's fast-growing list of VAR and SI partners.

expressor was founded in 2003 by experienced data integration and data warehousing practitioners and executives. The company is headquartered in Burlington, MA and is funded by Commonwealth Capital Ventures, Globespan Capital Partners and Sigma Partners. For more information visit www.expressor-software.com.

This white paper, in whole or in part, may not be photocopied, reproduced, or translated into another language without prior written consent from Athena IT Solutions. This edition was published January 30, 2008.

expressor software corporation
1 New England Executive Park
Burlington, Massachusetts 01803
+1 781-505-4190 Phone
+1 781-505-4197 Fax
www.expressor-software.com

PO Box 178
Stow, Massachusetts 01775
+1 978-897-3322 Phone
+1 978-461-0809 Fax
www.athena-solutions.com

