DAMA Day 2016 - Bio&Abstract


Kelle O'Neal



Having worked with the software and systems providers key to the formulation of Master Data Management (MDM), Kelle O'Neal has played important roles in many of the groundbreaking initiatives that confirmed the value of MDM to the enterprise. Recognizing an unmet need for clear guidance on the intricacies of implementing data solutions, she founded First San Francisco Partners. Under her leadership, FSFP immediately established a reputation as the first-call resource for companies looking to tap the value of Enterprise Information Management (EIM), MDM and Data Governance (DG). Kelle developed her ability to work through organizational complexity, build consensus and drive results in senior roles at companies such as Siperian, GoldenGate Software, Siebel Systems and Oracle. Her strong background enables her to provide expert counsel to organizations seeking to execute an EIM, MDM or DG project. Kelle holds degrees from Duke University and the University of Chicago Booth School of Business.


Aligning Governance and Analytics to Ensure Trust and Transparency

The DGPO defines Data Governance as "a discipline that provides clear-cut policies, procedures, standards, roles, responsibilities and accountabilities to ensure that data is well-managed as an enterprise resource." Too often, the influence of Data Governance stops at data creation and management rather than extending to data-consuming organizations such as Reporting and Analytics.

Are your Analytics groups involved in Data Governance? In this session, we will discuss the importance of aligning your Data Governance and Analytics organizations. We will show how these intersecting communities add value to each other, bring context to the conversation, and create end-to-end accountability for data.

We will answer the questions:

  • What are the organizational constructs that need to be considered to integrate Data Governance and Analytics?

  • What organizational change can be anticipated and how should it be addressed?

  • How do you design your data governance program to support Analytics? How is this different from an operational use case?


Joe Caserta


Joe Caserta is a celebrated big data strategy consultant, author, educator and president of Caserta Concepts, an award-winning strategic consulting and technology implementation firm. His company specializes in Transformative Data Strategies, Modern Data Engineering, Advanced Analytics, Strategic Consulting and Technical Architecture, and Design and Build Solutions, helping clients maximize data value. Joe is co-author of the industry best-selling book The Data Warehouse ETL Toolkit (Wiley, 2004), a contributor to industry publications, and a frequent keynote speaker and expert panelist at conferences and events. He also serves on the advisory boards of financial and technical institutions, and is the organizer and host of the Big Data Warehousing Meetup Group in NYC.


Traditional data warehouses were built with online transaction processing (OLTP) technology and architecture that today can be 15-20 years old. These data warehouses were right for their time, but they were not designed to handle the volume, variety and velocity of information that businesses face today. Over the years we tried to fit more and more data into these warehouses, and the result has been an over-burdened system that offers suboptimal access to data in today's data-driven world. Companies can benefit from new technologies and innovative approaches to modernizing the warehouse for improved performance. There are new and existing technologies that, when blended effectively, can deepen business understanding through data analytics and support business needs. Moving to the data lake paradigm can mean scalable, agile performance, storage and processing solutions for all data – whether structured, semi-structured, or even streaming data. In this session, I share:

  • Strategies to move from traditional data warehouse culture to the Big Data Lake

  • Guidance for designing a process for how the EDW can respond to the Big Data Analytics need

  • Understanding and maximizing the relationship of big data, Hadoop, and analytics

  • Steps for Using the Data Lake to “Put it all Together” - Ingest, Organize/Define/Complete, Blend and Learn, Report

  • Case study examples of how establishing a Data Lake can be a game-changer for today’s organization

  • Lessons Learned from client engagements
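The four "Put it all Together" steps above can be sketched as a minimal pipeline. This is an illustrative assumption only: the function names, schema and sample records are invented for the sketch and are not Caserta Concepts' actual implementation.

```python
# Minimal sketch of the Data Lake flow: Ingest ->
# Organize/Define/Complete -> Blend and Learn -> Report.
# All function names and sample data are illustrative assumptions.

def ingest(raw_sources):
    """Land raw records from every source as-is, tagging provenance."""
    return [dict(rec, _source=src)
            for src, recs in raw_sources.items() for rec in recs]

def organize(landed):
    """Define a common schema and complete missing fields."""
    return [{"customer": r.get("customer", "unknown"),
             "amount": float(r.get("amount", 0)),
             "_source": r["_source"]}
            for r in landed]

def blend_and_learn(organized):
    """Blend sources and derive a simple insight: total spend per customer."""
    totals = {}
    for r in organized:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

def report(totals):
    """Publish the blended result in a consumable, ranked form."""
    return sorted(totals.items(), key=lambda kv: -kv[1])

raw = {
    "crm": [{"customer": "acme", "amount": "100"}],
    "web": [{"customer": "acme", "amount": "25"}, {"customer": "zed"}],
}
print(report(blend_and_learn(organize(ingest(raw)))))
# [('acme', 125.0), ('zed', 0.0)]
```

The point of the sketch is the ordering: raw data lands untouched first, and schema, blending and insight come later, which is what distinguishes the lake from a traditional schema-first warehouse.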

Martha Dember



Ms. Dember is an accomplished and respected IT executive with over two decades of experience and a unique blend of deep technology and strong people skills that allow her to align business needs with IT capabilities. Her peers consider her an expert in directing solutions to complex challenges and leading visionary programs, including Big Data, Analytics and Data Governance implementations. Practiced in managing business expectations and delivering solutions from concept through implementation, Ms. Dember is considered an industry thought leader in BI, Big Data & Analytics. Currently at Kimberly-Clark, Ms. Dember is establishing a global data governance and quality program while building out a "data marketplace" in support of Data as a Service / self-service BI, enabling the transformation of the company into a "Data Driven" organization.


The Data Catalog in Big Data Governance

This presentation will start by addressing the role of Data Governance in the world of Big Data. It will then define what a data catalog is and how it is used to enable "data as a service" in a "Self Service BI" environment. Points of interest that will be covered include:

  • What changes in the role of governance when we address Big Data?

  • Why is analytics important in any and every organization today?

  • What is meant by “Data as a Service” and “Self Service BI”?

  • What is the Data Catalog?

  • What are the development stages of the Data Catalog?
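As a rough illustration of what a data catalog holds to enable self-service BI, the sketch below shows a generic catalog entry with register and search operations. The fields and names are assumptions for illustration, not Kimberly-Clark's catalog design or any product's schema.

```python
from dataclasses import dataclass, field

# Illustrative data catalog entry; the attributes shown are typical
# catalog fields (location, owner, quality), not a real standard.
@dataclass
class CatalogEntry:
    name: str
    description: str
    location: str          # where the data set lives
    owner: str             # accountable data steward
    quality_score: float   # e.g. 0.0 - 1.0 from profiling
    tags: list = field(default_factory=list)

catalog = {}

def register(entry):
    """Publish a data set so self-service users can discover it."""
    catalog[entry.name] = entry

def search(tag):
    """Let a BI user find governed data sets by tag."""
    return [e.name for e in catalog.values() if tag in e.tags]

register(CatalogEntry("sales_2016", "Global sales transactions",
                      "s3://lake/sales/2016", "finance-stewardship",
                      0.92, tags=["sales", "finance"]))
print(search("sales"))
# ['sales_2016']
```

The catalog is what turns "data as a service" from a slogan into a mechanism: consumers search for governed, described data sets instead of requesting extracts from IT.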

Don Soulsby


Don Soulsby is Sandhill Consultants' Vice President of Architecture Strategies. His practice areas include strategic and technical architectures for data management, metadata management and business intelligence.

Mr. Soulsby has held senior professional services and product management positions with large multi-national corporations and software development organizations. He has an extensive background in enterprise architecture and data modeling methodologies. He has over 30 years of experience in the development of operational and decision support applications. He is completing his qualification as an Enterprise Data Management Expert (EDME) with the CMMI Institute. 

Mr. Soulsby is an excellent communicator and has taught metadata, data modeling and data warehouse courses through public offerings and onsite engagements to corporate clients. He is a recognized thought leader who speaks regularly at international industry events, MIT CDO Conferences and DAMA functions.


Does Data Quality Matter in the Era of Big Data?

As an enterprise moves from historical business intelligence (BI) reporting to predictive analysis founded on Big Data, the significance of data quality cannot be overstated. Witness a 2014 article in The New York Times in which author Steve Lohr states, "Data scientists, according to interviews and expert estimates, spend from 50 percent to 80 percent of their time mired in this more mundane labor of collecting and preparing unruly digital data."

With conventional BI tools, the consumer of the information can drill down through a report to the detail when an aggregate number does not "feel" right, and it is within this process that data anomalies are detected. This will not be possible with Big Data, given the sheer volume and complexity of the data. High-quality data in Big Data collections will therefore become vital for successful outcomes.

This presentation will look at the elements of data quality required to make effective use of algorithm-based data analysis and predictive analytics. Based on the Data Management Maturity Model from the CMMI Institute, we will review the capabilities of an effective Data Quality program. We will look at the competencies required to evolve to the level of quality necessary to support data analysis and analytics with Big Data.
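One common answer to the scaling problem described above is to codify quality expectations as rules and run them automatically, rather than relying on a human drilling down when a number feels wrong. The sketch below is a generic illustration of that idea; the rules, thresholds and sample rows are assumptions, not the CMMI Data Management Maturity Model's prescription.

```python
# Minimal sketch of automated, rule-based data quality checks that
# replace manual drill-down at Big Data scale. Rules and sample rows
# are illustrative assumptions.

def completeness(rows, column):
    """Fraction of rows where the column is present and non-empty."""
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows) if rows else 0.0

def within_range(rows, column, lo, hi):
    """Fraction of rows whose value falls inside an expected range."""
    ok = sum(1 for r in rows if lo <= r.get(column, lo - 1) <= hi)
    return ok / len(rows) if rows else 0.0

def run_checks(rows, rules):
    """Score every rule and flag whether it meets its threshold."""
    return {name: (score, score >= threshold)
            for name, (fn, threshold) in rules.items()
            for score in (fn(rows),)}

rows = [
    {"amount": 120.0, "region": "EMEA"},
    {"amount": -5.0,  "region": "APAC"},
    {"amount": 80.0,  "region": ""},
]
rules = {
    "region_complete": (lambda r: completeness(r, "region"), 0.95),
    "amount_in_range": (lambda r: within_range(r, "amount", 0, 10000), 0.95),
}
print(run_checks(rows, rules))
```

Both checks fail here (one empty region, one negative amount), which is exactly the kind of anomaly a human would once have found by drilling down through a report.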

Dan Sholler



Dan Sholler is an experienced software industry expert with broad and deep experience in data and software. Dan began his career as a developer and product manager for BI and reporting software, then moved into integration and middleware. He spent several years at Gartner as an industry analyst covering data management, middleware, application architecture and SAP. He has spent time at various software companies and is now Director of Product Marketing at Collibra.


Data Governance for Big Data: Enabling Today's Data Stewards

Big data, big opportunities, big challenges? Using big data to drive new analysis of your business and your customers can generate tremendous returns. However, it also presents big challenges: users want both complete freedom to analyze the data and data of very high quality. Balancing these goals demands a flexible and dynamic approach to data governance. In particular, enabling a variety of data stewardship tasks and models is critical to success.

Mandy Chessell



Mandy is an IBM Distinguished Engineer, Master Inventor and member of the IBM Academy of Technology. She is a member of the IBM Analytics Group CTO office, working on strategic client engagements, solutions and emerging technologies. This includes the Data Lake, the Next Best Action solution and the strategy for Open Metadata and Information Governance. Mandy is the author of multiple books, including "Governing and Managing Big Data for Analytics and Decision Makers."


Metadata describes data in all its forms. This includes where the data is located, how it is stored, how frequently it is changing, what it represents, how it is organized, who owns it and how accurate it is. When good metadata is available, the data it describes can be rapidly located and assessed for new applications and analytics. Without metadata, data-oriented projects are delayed while the team searches for the data they need; in many analytics projects, this process can consume over 70% of the project resource.

This session covers the role of metadata in today's complex data environments, which span on-premise systems and cloud services and include all types of data, from the smallest IoT devices to mission-critical applications and large data lakes. It asks how open source projects such as Apache Atlas, combined with advanced analytics, can automate the management of metadata so that enterprises have a view over their data from a governance point of view, and so that individuals can locate and make use of the data they need in a data-driven operation.
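To make the role of metadata concrete, a minimal metadata record covering the attributes listed above (location, storage, freshness, meaning, ownership, accuracy) might look like the sketch below. It is a generic illustration with invented field names; it is not Apache Atlas's type system or REST API.

```python
# Illustrative metadata store for data assets; the attribute names
# mirror the ones listed above and are assumptions, not a standard.
metadata_store = {}

def describe(asset, **attrs):
    """Attach metadata to a data asset so it can be found and assessed."""
    metadata_store.setdefault(asset, {}).update(attrs)

def find(**criteria):
    """Locate assets whose metadata matches every given criterion."""
    return [a for a, m in metadata_store.items()
            if all(m.get(k) == v for k, v in criteria.items())]

describe("customer_master",
         location="on-premise/warehouse/dim_customer",
         storage="relational table",
         refresh="daily",
         meaning="golden record for customers",
         owner="data-governance-office",
         accuracy=0.97)
describe("clickstream",
         location="cloud/lake/raw/web",
         storage="parquet files",
         refresh="streaming",
         owner="digital-analytics",
         accuracy=0.80)

print(find(owner="data-governance-office"))
# ['customer_master']
```

The search in the last line is the payoff the abstract describes: with metadata in place, locating and assessing data becomes a query rather than weeks of hunting.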

Robert Quinn



Robert Quinn is a senior strategy and technology consultant with extensive experience delivering robust, integrated, and scalable data management solutions for companies across multiple verticals. An advocate for effective and efficient design, Robert is a pragmatic solution architect who champions a holistic approach to data delivery. He offers a rare ability to craft the vision with executive stakeholders, translate that into business requirements, and then engage and manage front-line teams through to delivery and adoption.

As a strategic partner to clients such as Prudential Financial, John Wiley & Sons, Pearson and The Hartford Insurance, Robert brings industry knowledge, bolstered by proven solution design and delivery methodologies to each engagement. A true business advocate, Robert quickly immerses himself into the language, culture, and long-term objectives of the companies he supports.

For more than 20 years, Robert has been responsible for the design, development and implementation of best-in-class solutions, including analytics, data consolidation/integration hubs, and data governance programs. He consistently adds value to the business by transforming requirements and business concepts into next generation data-driven applications. A proponent of both open source technologies and flexible architecture, Robert works with business and technical teams to ensure that solutions will evolve and adapt with the changing needs of the organization.

With an innovative viewpoint to each engagement, backed by years of experience at all levels of project initiation and execution, Robert adds value to every stage of solution delivery. His technology-agnostic perspective, effective communication style, and hands-on approach are the cornerstones of his successful practice.


Innovations in Data Governance, Architecture, and Analytics

Businesses are striving for more agility, and for many use cases, data volume, source variety and update velocity continue to increase.

This session will describe how innovations in data governance, data architectures and analytics are helping IT deliver data, and frontline business teams better utilize it, to support critical business outcomes. We'll cover approaches used successfully across a range of clients, and describe some of the new requirements and innovations we see coming over the horizon.

  • Big Data technology and approach

  • Machine learning

  • Crowd Sourcing Data Quality

  • Data Wrangling & Integration