My latest presentation is now available on BrightTalk. The talk is about how organisations can use SAP PowerDesigner to help them ensure that their Business Transformation is effective. I cover the following topics:
– Six simple questions businesses need to answer, which together form a framework for enterprise architecture
– How PowerDesigner supports Enterprise Knowledge and Enterprise Governance
– How SAP PowerDesigner provides the essential impact analysis
– Real-world examples of PowerDesigner use
SAP PowerDesigner allows users to build a blueprint of their current enterprise architecture and visualise the impact of change before it happens.
One of the key IT systems for any insurance broker is the Quotations system, where they record details of all the insurance companies they represent, the types of risk they might cover, their customers, the risks those customers would like to have covered, all the possible questions they could ask to assess risk, the answers they received, and the resulting quotes. In a commercial insurance broker, those quotations can be very large and complicated.
About 15 years ago, one such insurance broker’s quotations database had been designed to handle any product, any type of client, and any risk they might want to cover, without requiring any structural changes to the database or much change to the user interface. It was what we call a ‘metadata-driven’ database.
This was good for the business because they focused their attention on defining their requirements, and it was good for some of the developers because all they had to do was turn those requirements into data to store in tables – there were a lot of code tables, “question” tables, and “answer” tables. This knowledge was good for job retention.
However, the database design was not good for anybody who had to get data out of the database, perhaps to print a quotation: to extract the right data, you had to look up all sorts of code values to find the right words to use for labels, and the right data to extract.
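To illustrate why extraction was painful, here is a minimal Python sketch of a metadata-driven store. The real schema isn’t shown in this post, so the table layouts and names below are invented purely for illustration – the point is the chain of lookups needed before anything is readable.

```python
# Hypothetical metadata-driven quotations store: questions, code tables,
# and answers all live as data, so nothing is readable without lookups.
questions = {101: "Number of employees", 102: "Annual turnover currency"}
codes = {("CURRENCY", "GBP"): "Pounds Sterling"}
answers = [  # (quote_id, question_id, raw_value, code_type)
    (1, 101, "250", None),
    (1, 102, "GBP", "CURRENCY"),
]

def render_quote(quote_id):
    """Resolve question labels and code values into printable lines."""
    lines = []
    for qid, question_id, raw, code_type in answers:
        if qid != quote_id:
            continue
        label = questions[question_id]            # label lookup
        value = codes.get((code_type, raw), raw)  # code-table lookup
        lines.append(f"{label}: {value}")
    return lines

print(render_quote(1))
```

Every row printed needs at least two lookups; multiply that across hundreds of questions and code tables and the reporting problem becomes clear.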
So, they decided to extract the data from that database into another database that would make it easier for someone to look at or report on the data. Their data modelling tool was PowerDesigner (then owned by Sybase). For the nerds among you, it was version 11 – SAP have recently issued version 16.7; under the numbering system Sybase used, we’d probably be up to 26 or so.
Modelling and building the new database was not their biggest concern; their biggest headache was describing how to convert the quotations data into the new format in such a way that the developers could build it and the testers could test it. They also needed to migrate some of that data to their Policy Management system, and develop messages for their messaging infrastructure, but I won’t talk about that – this tale is long enough already.
One day, the project team were surprised to see the head of the company’s Information Architecture team, who most of them knew at least by sight, come into their office and sit down at a PC in the corner. He sat there for two weeks beavering away on, well, something. Nobody would tell the project team what he was doing, so they assumed it was something hush-hush that needed to be done away from the rest of his team. All very mysterious 😊.
At the end of the two weeks, the outcome of his sweat and toil was revealed – it was an Access database. Please, don’t groan, especially if you can guess what’s coming next.
The Access database was, admittedly, an amazing thing – it was designed to hold the Physical Data Model of the Quotations database, as well as the Conceptual Data Model representing the business data, the Physical Data Model for the new database, and the links between those models. It could import the data models initially, but all modelling changes had to be entered manually. The database generated the specifications to tell the developers the rules for moving the data between the two databases.
Applying the changes was a bit tedious, as the team had to make the model changes in PowerDesigner first, then make them again in the Access database; and there was nothing visual about that database – no ERDs to look at, just lots of metadata. The modelling work slowed down considerably.
I can hear your thoughts already, you’re wondering “What’s the point of this long ramble, is it another dig at Access databases?”.
I’ll leave you to figure that one out for yourself; just let me tell you one interesting fact.
PowerDesigner version 11 already supported everything the Access database did – nobody had investigated the capabilities of their existing tool.
The first online PowerDesigner training course was released recently, covering the Excel Import Wizard that is supplied with PowerDesigner. This allows you to import practically anything from Excel into any PowerDesigner model without writing a single line of code – if you want to import content into a type of model object or property that does not exist yet, the wizard can even define it for you.
A Canadian customer was kind enough to say that the course easily justifies the cost:
“I found it extremely useful and became aware of a lot more functionality that I was not aware of around how Excel can be utilised within PowerDesigner. This will apply directly to the work that we are doing as it will allow us to get stakeholders to populate Excel templates that our data modeling team can then develop into actual PowerDesigner data models.”
In my last blog post I described my new method for building scripts. Well, last week I delivered my first script to a client in France – it’s a script to import source-to-target mappings (otherwise known as Data Lineage) from Excel. The client has been relying on Excel to document the movement of data up to now, and that’s not a very good way of providing real data lineage that you can enquire on. By linking the PDMs together they’ll be able to construct much more effective data lineage in PowerDesigner.
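The script itself is PowerDesigner VBScript, so here is only a rough Python sketch of the core idea: turning rows of a mapping sheet into structured source-to-target mappings. The column layout is a guess at a typical spreadsheet, not the client’s actual format, and CSV stands in for Excel to keep the sketch dependency-free.

```python
import csv
import io

# Hypothetical source-to-target mapping sheet (CSV standing in for Excel).
sheet = io.StringIO("""source_table,source_column,target_table,target_column,rule
QUOTE,QUOTE_REF,DW_QUOTE,QUOTE_ID,trim and uppercase
QUOTE,ISSUE_DT,DW_QUOTE,ISSUED_ON,convert to ISO date
""")

# Each row becomes one mapping record that a script could then attach
# to the matching columns in the source and target PDMs.
mappings = list(csv.DictReader(sheet))
for m in mappings:
    print(f"{m['source_table']}.{m['source_column']} -> "
          f"{m['target_table']}.{m['target_column']} ({m['rule']})")
```

Once the mappings are objects linked to real model columns, rather than free text in a spreadsheet, you can query the lineage in either direction.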
Anyway, the client loved the script, once I’d made a couple of changes to allow for the fact that (unsurprisingly) they’re using the French version of PowerDesigner. Using my model-driven approach, it was simple for me to update the affected components and re-generate the complete script. Voilà, as they say in France.
If you attended my DAMA UK webinar last year, thank you. If you didn’t, where were you? <grin>
The webinar description was:
“To be successful at Data Architecture, organisations have always needed to understand what data they have, why they have it, how they use it, where it is, how it got there, and what happened to it along the way – forming a ‘pipeline’ of information. Very often, this pipeline is managed and documented using a mish-mash of tools and methodologies, often resulting in conflicting and contradictory information, making it difficult for Data Governance to be effective. In this webcast, George McGeachie will demonstrate the key features of SAP PowerDesigner that support the pipeline. A set of requirements and business rules, and an existing Conceptual Data Model, will be used to create Logical and Physical Data Models. This will include the generation of JSON structures from both Logical and Physical Data Models. Some of what you will see is not out-of-the-box – it has been built using PowerDesigner’s powerful customisation features.”
Well, I think I demonstrated that pipeline okay, though I did make one slight mess-up on the JSON PDM (see below). That’s the way it goes with one-off demos – the old memory plays tricks on you. Here’s a quick reminder of the steps I followed:
Import Requirements from a Mind Map – via OPML and a Word document – then create a Conceptual Data Model (CDM) from the Requirements. I built the OPML import using the PowerDesigner XML Import feature; the Word import is part of the product.
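OPML is just XML, which is why the XML Import feature can handle it. As a minimal sketch of what that import does, here is some Python pulling requirement text out of an OPML outline – the outline content below is a guess at a typical mind-map export, not the actual file used in the webinar.

```python
import xml.etree.ElementTree as ET

# Hypothetical mind-map export in OPML: each leaf outline node
# becomes one requirement.
opml = """<opml version="2.0"><body>
  <outline text="Hotel Loyalty Requirements">
    <outline text="Members earn points for each stay"/>
    <outline text="Points can be redeemed for free nights"/>
  </outline>
</body></opml>"""

root = ET.fromstring(opml)
requirements = [node.get("text")
                for node in root.iter("outline")
                if len(node) == 0]  # leaf outlines only
print(requirements)
```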
Merge this CDM into the existing Hotel Loyalty CDM
Normally I would be more selective about the changes that I carry over from one model to another; today, I carried over every change that didn’t delete something in the target model.
My objective was to show that the links back to the original requirements were still maintained. Here is the link from one of the Hotel Loyalty CDM Entities, back to the original requirement:
I also showed the additional Entity properties that I added via a model extension:
This included a link to a new type of object I’ve added to the model – the Data Steward. Here we can see the list of entities that one of the Data Stewards is responsible for:

I didn’t show you this, but the model also contains a Dependency Matrix, which can be used to edit the links between entities and Data Stewards. It can also be used for simple validation, such as showing only those entities that do not yet have a Data Steward.
There is also a matrix that allows you to edit the links between Data Items and Domains.
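That kind of validation is conceptually very simple – it is just a filter over the entity-to-steward links. Here is a tiny Python sketch of the idea; the steward assignments and entity names are invented for illustration.

```python
# Invented entity-to-steward assignments for the Hotel Loyalty model.
stewards = {
    "Reservation": "A. Patel",
    "Facility": "B. Jones",
}
entities = ["Reservation", "Facility", "Loyalty Member", "Stay"]

# The validation the matrix supports: entities with no steward yet.
unassigned = [e for e in entities if e not in stewards]
print(unassigned)
```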
Generate LDM from the CDM
I generated a Logical Data Model (LDM) from the CDM, making sure not to run a model check, as errors would prevent the new model from being generated. I didn’t show you the model options, where the option to allow many-to-many relationships was disabled.
There is a many-to-many relationship in the CDM, between Facility and Reservation – I’ve called this “Facility Reservation”:
In the LDM, an entity was automatically created to replace the relationship:
It’s worth mentioning here that I can do this in the CDM or LDM at any time – any relationship can be converted into an entity, preserving the overall optionalities and cardinalities.
The LDM has a couple of model extensions attached – one of them generates JSON and CSV, and the other generates facts about the model as text. Both involved writing some Generation Template Language (GTL), which allows me to mix boilerplate text and information from the model, in much the same way as when we generate Data Definition Language (as SQL) from a Physical Data Model.
Here’s a JSON sample:
An entity summary:
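The actual extensions are written in GTL, but the idea translates directly: walk the model, mixing boilerplate text with model facts. This Python sketch does the same for one cut-down, invented entity – producing a placeholder JSON sample and a one-line entity summary.

```python
import json

# Invented, cut-down entity definition standing in for a model object.
entity = {
    "name": "Reservation",
    "attributes": [
        {"name": "reservationNumber", "type": "string"},
        {"name": "arrivalDate", "type": "date"},
    ],
}

# JSON sample: one placeholder value per attribute.
sample = {a["name"]: f"<{a['type']}>" for a in entity["attributes"]}
print(json.dumps({entity["name"]: sample}, indent=2))

# Entity summary: boilerplate text mixed with model facts, GTL-style.
summary = (f"Entity {entity['name']} has "
           f"{len(entity['attributes'])} attributes: "
           + ", ".join(a["name"] for a in entity["attributes"]))
print(summary)
```

DDL generation from a Physical Data Model works the same way – only the boilerplate changes.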
Generate JSON PDM from LDM, then generate SQL Server PDM from LDM
I could have used a subset of the LDM as my source for generating the two PDMs, but I chose to generate the whole model.
In the LDM, I could now see (though I didn’t show this at the time) the links from the LDM to the objects in the PDM. Here’s an example of what happened to one of the entities:
I didn’t look at the two PDMs in any detail, as the purpose of the webinar was to show the pipeline of artefacts and models. I did look at the Preview tab for one of the tables in the JSON model, which was where I made my mistake. A JSON model represents a complete JSON structure, possibly a message, so I should have looked at the Preview tab on the model. Here it is:
Use the Mapping editor to create mappings from the JSON model to the SQL Server model
In the interest of saving time, I allowed PowerDesigner to populate the default mappings, which mapped everything in each model to the equivalent in the other model.
Load the JSON-SQL mappings into a Data Movement Model
The Data Movement Model already contains some detailed data movements for MDM and a Data Warehouse:
I used the built-in wizard to import the mappings from the SQL Server PDM, and create a single transformation task in the model. This links back to the tables and columns in the two PDMs, and allows me to describe the transformation actions in detail:
I also showed an alternative view of the Data Warehouse Data Architecture, in an Enterprise Architecture Model, which is linked to the Data Movement Model. As an architecture model, it doesn’t contain the level of detail that the Data Movement Model does; what it adds is the ability to connect this data architecture to business functions, servers, locations, and other enterprise architecture artefacts. The Enterprise Architecture model also links to the Physical Data Models, ensuring traceability at the model level.
Finally, I created a Project and added all the models to it, so we can visualise the links between models. Each symbol on this diagram represents one of the models. I can open a model by double-clicking its symbol.
One thing left, which I’m sure I showed somewhere near the end. I showed the Impact and Lineage analysis for one of the tables in the JSON PDM. This followed all the links from object to object, to the entity it was generated from, to the source-to-target mappings, to the mappings in the Data Movement Model, and kept on going. For example, we can see that the table was generated from an entity in the LDM, and the LDM entity was generated from a relationship in the CDM (remember the many-to-many relationship we saw earlier), which is connected to the Facility and Reservation entities in the CDM. I haven’t expanded far enough in the diagram above to show the link to one of the requirements we started with, but it’s there.
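Under the hood, that analysis is a traversal of the link graph between model objects. As a sketch, here is a breadth-first walk over a toy graph whose node names follow the lineage described above – the graph itself is hand-built for illustration, not extracted from the models.

```python
from collections import deque

# Toy link graph mirroring the lineage described above.
links = {
    "JSON table": ["LDM entity"],
    "LDM entity": ["CDM relationship"],
    "CDM relationship": ["Facility entity", "Reservation entity"],
    "Facility entity": ["Requirement"],
}

def lineage(start):
    """Breadth-first walk over model links, like the lineage analysis."""
    seen, queue = [], deque([start])
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.append(node)
        queue.extend(links.get(node, []))
    return seen

print(lineage("JSON table"))
```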
I clicked the ‘Generate Diagram’ button, which created an Impact Analysis Model from the above analysis. As this is a model, I could save it for future reference, and refresh its contents in the future. In this diagram, changes to the objects on the left could affect the table. Each of the symbols is a link to the actual object, so I can access the CDM and LDM entities, the Data Transformation, and the related tables.
Again, thanks for attending, or for reading this far if you didn’t attend the webinar. Please use the ‘Contact me’ option on the blog to send me a message, or find me on Twitter, Xing, or LinkedIn.