DOQS REQUIREMENTS ENGINEERING SERIES
Quality-Based Data Modeling
Quality-Based Data Modeling improves the identification and analysis
of data requirements at the enterprise, project, and data base levels.
Through a synthesis with basic quality management principles, data model
specificity and rigor are increased, reducing the data omissions
and defects often encountered with traditional data analysis techniques.
These added analysis paradigms catch, early in the modeling process,
data errors that persist in spite of rigorous normalization. The result
is more complete data definitions, clearer models, and the surfacing
of data-related issues that often remain buried until late in the
project implementation cycle.
This seminar explains how to define, verify, and validate data models
for information systems:
- Enables an understanding of the business through a focused data analysis
of customers, products, and the infrastructures that bring them together.
The resulting Enterprise Data Model can be used to align organizational
data and mission.
- Extends the value of traditional model components by altering modeling
techniques to use additional quality-based criteria; for example, using
supply-demand concepts to properly decompose many-to-many relationships
into their component associations according to how the business will
actually view and use them, rather than by some non-value-adding
mechanical technique (see the first sketch following this list).
- Internalizes business exceptions by forcing the delineation of additional
levels of detail in the model as general rules, so that exceptionless
processes become possible, reducing project requirements and increasing
business value.
- Forces containment of business event data within the business, allowing
for project and process segmentation through the differentiation of
maintenance vs. processing data requirements. Such segmentation allows
resources to be properly prioritized toward increased normalization
of dynamic processing requirements.
- Requires explicit definitions of troublesome data characteristics
that traditionally cause significant development and testing problems
on projects, e.g., status codes (see the second sketch following this list).
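To make the many-to-many decomposition above concrete, here is a minimal
Python sketch (ours, not the seminar's; Customer, Product, and Order are
illustrative names) showing how a meaningless M:M between Customer and
Product is replaced by an associative Order entity, the business event
through which demand actually occurs:

    from dataclasses import dataclass
    from datetime import date

    # The raw M:M ("a Customer buys many Products; a Product is bought
    # by many Customers") names no business event and hides the data
    # the business actually records about each purchase.

    @dataclass
    class Customer:
        customer_id: int
        name: str

    @dataclass
    class Product:
        product_id: int
        description: str

    # Decomposed: the association is an entity in its own right, named
    # for how the business views it (here, an Order on the demand side).
    # One M:M becomes two 1:M relationships, each with a clear meaning.

    @dataclass
    class Order:
        order_id: int
        customer_id: int   # a Customer places many Orders
        product_id: int    # a Product appears on many Orders
        order_date: date
        quantity: int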
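Likewise, for the status code point: a hedged sketch (again ours;
OrderStatus and its values are hypothetical) of what an explicit
definition can look like, with each value documented and the legal
transitions stated as data rather than buried in program logic:

    from enum import Enum

    class OrderStatus(Enum):
        # Each value is defined explicitly rather than left as an opaque
        # code whose meaning lives only in scattered program logic.
        PLACED    = "PL"   # order captured, not yet checked against stock
        ALLOCATED = "AL"   # stock reserved for this order
        SHIPPED   = "SH"   # goods have left the warehouse
        INVOICED  = "IN"   # billing document issued
        CLOSED    = "CL"   # payment received; no further activity allowed

    # Legal state transitions, stated once as data.
    ALLOWED_TRANSITIONS = {
        OrderStatus.PLACED:    {OrderStatus.ALLOCATED},
        OrderStatus.ALLOCATED: {OrderStatus.SHIPPED},
        OrderStatus.SHIPPED:   {OrderStatus.INVOICED},
        OrderStatus.INVOICED:  {OrderStatus.CLOSED},
        OrderStatus.CLOSED:    set(),
    }

    def can_transition(current: OrderStatus, new: OrderStatus) -> bool:
        return new in ALLOWED_TRANSITIONS[current]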
This seminar supports the broadest range of data analysis activities,
from enterprise-wide data planning, through traditional, relational, and
object data base design, data warehousing, and expert systems, to applet
design for the latest Web pages.
Seminar Rationale
Organizations practicing data modeling during analysis and design
often fail to achieve the benefits typically associated with data
modeling. This seminar emphasizes adding quality management principles
to the data modeler’s toolkit in order to overcome the common obstacles
that prevent data models from having the desired impact on projects.
The traditional push for logical data base design is deferred in order
to let knowledge of the organization’s data requirements emerge. This
permits better alignment of the business process under analysis with
the mission and trends of the business. Project scope and complexity
are emergent properties of this modeling activity.
Seminar Uniqueness
Quality-Based Data Modeling continually de-emphasizes the technical
considerations of data and databases, instead treating logical data
as a tool for understanding and influencing the business. The result is
a focus on creating an exceptionless data processing environment by
embedding what would otherwise be process exceptions into the robustness
of the data models, as sketched below.
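As a hedged illustration of what such embedding can look like (the
example is ours, not the seminar’s): instead of a process branch for,
say, foreign-currency customers, the general rule is widened so that
every Customer carries a currency, and the former exception becomes
ordinary data:

    from dataclasses import dataclass

    # Exception-laden process view: currency handling hides in an
    # if-test ("if customer.is_foreign: convert(...)") that every
    # process must remember to include.
    #
    # Exceptionless data view: every Customer carries a currency, with
    # domestic customers simply defaulting to the home currency, so one
    # general rule covers all cases.

    @dataclass
    class Customer:
        customer_id: int
        name: str
        currency: str = "USD"   # hypothetical home-currency default

    def invoice_total(amount: float, customer: Customer,
                      rates: dict[str, float]) -> float:
        # rates maps currency code -> home-currency units per unit;
        # the home currency itself maps to 1.0.
        return amount * rates[customer.currency]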
Traditional data model concepts that are often taken for granted as
effective analysis tools are shown to be among the most troublesome
features of many modeling projects. Among these, normalization; entity,
attribute, and relationship naming; and bi-directional relationships
emerge as significant obstacles to quality data analysis.
Topical Outline
- A FRAMEWORK FOR DATA
- Zachman’s Conceptual Levels
- ENTERPRISE DATA MODELING
- Who Are Our Customers?
- What Are Our Products?
- How Are They Related?
- CONCEPTUAL DATA MODELING
- Classes and Types of Entities
- Defining Relationships
- Optionality & Cardinality
- Attributes & Model Specificity
- 1st, 2nd, & 3rd Normal Forms
- Supply-Demand Paradigm™
- SUBJECT AREA DATA MODELING
- Supertype / Subtype Analysis
- M:M Conversion Techniques
- Disbursed Domain Paradigm™
- Discrete Detail Paradigm™
- PROCESS VIEW DATA MODELING
- Relationship Cardinality
- Derived & Implied Attributes
- 4th Normal Form
- Atomic State Paradigm™
- Logical-Physical Paradigm™
- DECLARATIVE DATA MODELING
- Domains & Atomicity
- Attribute Constraints
- In-Time Paradigm™
- Role Removal Paradigm™
- EVOLUTION OF DATA MODELS
- Time & Space
- Negatives & Denials
- Fuzziness & Completeness