Data Modelling Tool Requirements

From time to time, great debates arise in discussion forums about the pros and cons of various data modelling tools; inevitably, one or more people say that their favourite tools are the whiteboard and sticky notes. The whiteboard and sticky notes are great tools for workshop use, enabling rapid development of a model by a group of people without technical data modelling skills. Once a data model has been fleshed out on a whiteboard or a wall, it usually needs to be recorded in a more permanent fashion. Very often, the first choice is the old favourite analysis tool, the spreadsheet. Like the whiteboard and sticky notes, the spreadsheet is a great way of capturing information during and after a workshop, and doesn’t require expensive software licenses or special skills. By itself, however, it only provides a fraction of the capabilities required to construct, validate, communicate and use a data model effectively.

George McGeachie, Chapter 8, “Data Modeling Made Simple with PowerDesigner”, Technics Publications, New Jersey, 2011

Tools are necessary for data modelling; you can’t avoid them. But which one suits you? Well, that depends on what you want to do with those tools, where you want to work in the Data Model Pyramid, and how much of the Zachman Enterprise Architecture Framework you need to cover. If you prefer to work from a data model that defines the scope of a modelling tool, take a look at The Metadata Archipelago.

There are dependencies between different types of data models, and between data models and the other artefacts that represent business requirements, the enterprise and solution architecture, and application design. The activities required to produce and manage data models are only part of a wider set of business and technology activities; integration with those associated activities is key to the success of data modelling.

Without a tool that provides specialised support for data modelling, the data modeller cannot hope to work effectively in this environment.

Key features needed in a data modelling tool

With data modelling tools, you generally get what you pay for. The more you pay, the more features are provided by the product. The wider your use of data models within the organisation, the more features you tend to need. The features you need in a tool can be categorised as follows:

  • Core Modelling
  • Usability
  • Interfaces and Integration
  • Tool Management & Communication
  • Collaboration

Core Modelling

My ideal data modelling tool must provide support for all the different levels and types of data models you need to produce, with a choice of notations, effective ways of deriving one model from another, and the means to link models and the objects within them. Traceability between models is absolutely essential.

In Physical Data Models, there must be direct support for all technology-specific objects, plus the ability to customise that support (this factor will cut down the list of viable tools considerably).

Other key features include:

  • customisable model validation
  • version management and access control
  • managed re-use of models and model objects, including model patterns
  • business vocabulary definition, mapped to usage within models
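To illustrate what customisable model validation might mean in practice, here is a minimal sketch. The `Entity` class and the rules are entirely hypothetical and don’t come from any particular tool; the point is that modellers should be able to define their own checks, such as naming conventions and completeness rules, and run them across a model.

```python
# Hypothetical in-memory representation of a logical data model.
# Neither the Entity class nor the rules belong to any real tool.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    attributes: list = field(default_factory=list)   # attribute names
    primary_key: list = field(default_factory=list)  # subset of attributes

def validate(entities):
    """Apply simple, user-definable rules; return a list of findings."""
    findings = []
    for e in entities:
        if not e.primary_key:
            findings.append(f"{e.name}: no primary key defined")
        if not e.name[0].isupper():
            findings.append(f"{e.name}: entity names should start with a capital")
        for pk in e.primary_key:
            if pk not in e.attributes:
                findings.append(f"{e.name}: primary key '{pk}' is not an attribute")
    return findings

model = [
    Entity("Customer", ["customer_id", "name"], ["customer_id"]),
    Entity("order", ["order_id"], []),
]
for finding in validate(model):
    print(finding)
```

A real tool would let you add, remove, and configure such rules without programming, but the principle is the same: validation rules are data-driven checks over the model’s metadata.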


Usability

A tool can have every feature you could ever wish for and still be absolutely useless if it’s too difficult to use, or if every action requires so many steps that it becomes tedious.

Here are some key usability features you should look for:

  • control over windows, toolbars, menus
  • straightforward automation capabilities – reduce the drudge work
  • diagram auto-layout, and control over symbol placement
  • a single editable view of the properties of any object
  • diagram image export
  • expert, timely support

Interfaces and Integration

Data models do not exist in isolation – it is absolutely essential that a tool can pass information to and from other tools and technologies. Take a look at The Metadata Archipelago for my view on this subject.

For example, the tool needs to be able to:

  • Import and export data models
  • Create or update a model based upon information held in spreadsheets
  • Compare two data models of the same or different types, and update one or both models as a result (this may also be referred to as merging models)
    • including comparing two versions of a model
    • preview, print, and save comparison results
  • Compare a data model with an existing data artefact, such as an XML or database schema, and update the model and/or the data artefact as a result
  • Export data models in a format that can be imported into other tools
  • Generate test data to populate a test database or XML document
  • Build cross-references or traceability links between data model objects and objects defined in other types of models, such as business process or enterprise architecture models
  • etc.
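The comparison capability above can be sketched very simply. This assumes a hypothetical representation where each model is just a mapping of entity names to sets of attribute names; real tools compare far richer metadata (data types, keys, relationships), but the principle of walking both models and reporting differences is the same.

```python
# Hedged sketch of a model comparison: each model is assumed to be a
# dict of entity name -> set of attribute names (an invented shape).
def compare(model_a, model_b):
    report = []
    for name in sorted(set(model_a) | set(model_b)):
        if name not in model_b:
            report.append(f"- entity {name} only in first model")
        elif name not in model_a:
            report.append(f"- entity {name} only in second model")
        else:
            for attr in sorted(model_b[name] - model_a[name]):
                report.append(f"  {name}.{attr} added in second model")
            for attr in sorted(model_a[name] - model_b[name]):
                report.append(f"  {name}.{attr} missing from second model")
    return report

v1 = {"Customer": {"id", "name"}, "Order": {"id", "date"}}
v2 = {"Customer": {"id", "name", "email"}, "Invoice": {"id"}}
for line in compare(v1, v2):
    print(line)
```

A tool would go further: previewing the differences, then applying selected changes to one or both models (merging), or to a generated database schema.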

Tool Management and Communication

All tools need to be managed; with modelling tools, that involves much more than just controlling user access. You need to be able to:

  • Extend the tool’s underlying data model, allowing analysts to change the way in which model objects are defined and to define new types of model objects
  • Extract information from model objects for publication in various formats, such as html and documents
  • etc.
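Extracting model information for publication can be sketched as follows. The model shape here is a hypothetical mapping of entity names to attribute sets, not any tool’s real metamodel; only `html.escape` is standard library. The point is that the tool’s metadata should be open enough to render in whatever format the audience needs.

```python
# Sketch of publishing model metadata as an HTML table.
# The model shape (entity name -> set of attributes) is invented.
import html

def to_html(model):
    rows = []
    for entity, attrs in sorted(model.items()):
        cells = ", ".join(sorted(attrs))
        rows.append(f"<tr><td>{html.escape(entity)}</td>"
                    f"<td>{html.escape(cells)}</td></tr>")
    return ("<table><tr><th>Entity</th><th>Attributes</th></tr>"
            + "".join(rows) + "</table>")

print(to_html({"Customer": {"id", "name"}}))
```

The same extraction logic could just as easily emit a Word document, a wiki page, or a data dictionary spreadsheet.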


Collaboration

Some modelling tools expect modellers to work in isolation, with everyone creating their own models that nobody else can access. Fortunately, most tools are not like that. For example, a good tool is likely to provide:

  • A shared location for the storage of and controlled access to data models
  • Controlled access to models by multiple analysts
  • Controlled access to subsets of a model, perhaps individual model objects
  • Direct access to models and links between models via a portal designed for use by non-specialist users of data models
  • etc.
