IRM UK EA Conference – Outsourcing and EA

I presented a session on Outsourcing and EA at the IRM EA conference last week; specifically, how we as Enterprise Architects are in a prime position to ensure that outsourcing deals are both created and run effectively, because we hold the knowledge and understanding of both the business and IT across the entire enterprise.  We likened EAs to the Spartans at the battle of Thermopylae, who held off an army of (allegedly) a million men for seven days with only 300 warriors – primarily because they understood and had a map of the landscape.  (Unfortunately they were betrayed and slaughtered after a week – hopefully the analogy doesn’t stretch that far!).

Research by both Gartner and AT Kearney suggests that around a third of outsourcing initiatives fail.  We discussed how use of our architecture knowledge and artefacts can mitigate the risks of failure and how EA can be used to bring greater success.  We touched on our work helping organisations use EA and Essential together to reduce the outsource transition time (from idea to completed transition to a new provider) from a typical 18-24 months to 6-9 months, which addresses a key concern raised by the FCA.  We showed some examples of how Essential has been used to support such initiatives across a number of organisations.

The conference itself was very interesting and it seems to me that EA is really coming of age – there were many talks showing how EA is used in organisations to provide real and concrete benefit to senior managers.

If you would like a copy of the presentation then drop me an e-mail at the info at e-asolutions.com address.

Essential Information and Data Pack

It’s been a couple of months since we released the Information and Data Pack and I thought it would be useful to take a more detailed look at what is in this extension to the Essential Meta Model and what it can do for us.

Firstly, thanks to our community members who have given us feedback and found a couple of small bugs. We’ve started a forum thread to catch any more issues as we find them and will be releasing a patch and a new version of the pack in the coming weeks – we wanted to address as many issues as possible in a single update. If you’ve found any issues, please let us know in this forum.

This optional extension pack is a major addition to the Information layer of the meta model; although there are some tweaks to the core meta model elements, it mostly builds on what was already there.

We have added a number of important meta classes for managing Data elements and the relationships that these have to Application, Information and Business elements, enabling the resulting models to support Enterprise Data Management and Enterprise Information Management activities. More about that later in this blog.

One of the most important concepts in the pack is the separation between Information and Data. We have defined strong semantics for what is Information and what is Data, so that there is a clear and consistent framework for capturing and managing these elements of your Information and Data architecture.

We’ve based this on the commonly used “Information is Data in context” definition. Data elements have the same meaning and value regardless of the context in which they are used, whereas Information elements differ depending on the context. e.g. Product data means the same thing regardless of whether we are looking at it in the context of Stock, Sales or Manufacturing. That’s not to say that we only have one Product data element in our architecture; we can define as many variants of how Product data is managed in our organisation as we need – in terms of the attributes and relationships. e.g. Product Item data might have a different set of attributes to a Bill of Materials data object.

In contrast, Information takes data and uses it in a particular context. e.g. Stock Volume by Location might use data about Products at particular locations in the organisation and would have a different value for a single Product depending on the Location.

This separation of Information and Data fits neatly into how we need to manage these things in the real world: Data is combined and used to produce the Information that we need.

Naturally, we’ve used the Conceptual, Logical and Physical abstractions for the new Data meta classes, covering the WHAT, HOW and WHERE for data elements. In addition to these new meta classes, we have created powerful relationship classes that enable us to clearly understand how Information and Data are being used by Applications and Business Processes. Some of this might seem a bit complex at first glance, but that is due to the contextualised nature of these relationships. What we’ve created are constructs that enable us to understand what Information each Application uses and creates; which data elements are used, specifically, to deliver that Information in that context; and, again in that context, what the CRUD is for both the Information and the Data. We believe that having these types of contextual relationships is a uniquely powerful capability of Essential Architecture Manager. We haven’t added them just because we can, but because without them we cannot accurately or even reliably understand what is really going on in our organisation with respect to Information and Data.
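
To make the idea of contextualised relationships a little more concrete, here is a minimal, purely illustrative sketch in Python. The class and field names are hypothetical – they are not the Essential meta class names – but they show the shape of the chain: an Application uses or creates Information, and within that context specific Data objects are used, each usage carrying its own CRUD flags.

```python
from dataclasses import dataclass, field
from typing import List

# Purely illustrative structures -- NOT the Essential meta class names.
@dataclass
class DataUsage:
    data_object: str          # e.g. "Product Item"
    crud: str                 # CRUD letters for this data object in this context

@dataclass
class InformationUsage:
    information: str          # e.g. "Stock Volume by Location"
    crud: str                 # CRUD for the Information in this context
    data_used: List[DataUsage] = field(default_factory=list)

@dataclass
class ApplicationInformationContext:
    application: str          # e.g. "Warehouse Management System"
    usages: List[InformationUsage] = field(default_factory=list)

# One application, one piece of Information it creates, and the Data it reads to do so
wms = ApplicationInformationContext(
    application="Warehouse Management System",
    usages=[
        InformationUsage(
            information="Stock Volume by Location",
            crud="CR",
            data_used=[DataUsage("Product Item", "R"), DataUsage("Location", "R")],
        )
    ],
)
```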

So, what kind of things can we do with the Information and Data Pack?

We have designed the meta model to support Information Management and Data Management activities, managing all the dependencies that exist between information and data elements, as well as how these are used and produced by Application elements.

More specifically, in combination with the Views that are supplied out-of-the-box, we can understand where issues exist with our Master Data Management solutions, e.g.

  • where are we mastering Product data?
  • how is this data being supplied to our Applications?
  • how does this compare to how data should be supplied to our applications (according to our MDM policies)?

As you will have come to expect, the supplied Views are easy-to-consume, hyperlinked perspectives that are designed to give valuable insights to a wide range of stakeholders – not just Information and Data professionals.

We can browse the Data catalogue, drill into individual subjects, see what Objects we have in each subject and how each of these objects is defined. Further drill-downs can take us to views that show how this Object is provided to the relevant applications in the enterprise, and from where.

Based on the detailed CRUD information that we can capture in the model, we can easily produce CRUD matrices from a wide variety of perspectives, e.g. CRUD of Data Subjects by Business Capability, CRUD of Data Objects by Business Process – both of which are derived from the details in the model, which means that these are automatically updated as the content of the model is updated.
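
As an illustration of how such a matrix can be derived, here is a small, hypothetical sketch. The usage records are hard-coded for the example; in a real repository they would be read from the contextual relationships held in the model rather than typed in.

```python
from collections import defaultdict

# Hypothetical contextual usage records: (business process, data object, CRUD letters).
usages = [
    ("Order Fulfilment", "Product Item", "R"),
    ("Order Fulfilment", "Sales Order", "CRU"),
    ("Stock Replenishment", "Product Item", "RU"),
    ("Stock Replenishment", "Purchase Order", "C"),
]

# Merge the CRUD letters seen for each process / data object pairing
matrix = defaultdict(str)
for process, data_object, crud in usages:
    merged = set(matrix[(process, data_object)]) | set(crud)
    matrix[(process, data_object)] = "".join(c for c in "CRUD" if c in merged)

# Print a simple CRUD matrix of Data Objects by Business Process
for (process, data_object), crud in sorted(matrix.items()):
    print(f"{process:20} {data_object:15} {crud}")
```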

One of the most powerful Views is also one of the simplest. We find that providing a browser-based, easy-to-access place to find accurate and detailed definitions for all the Information and Data elements – and most importantly in easy-to-understand terms – is a capability that is quickly valued, in particular by non-technical stakeholders. Deliberately, there are no ERDs or UML class diagrams in these definitions. Rather, we can explore the catalogue of the Information and Data elements in a highly hyperlinked environment, providing an automatically generated data (and information) dictionary to the organisation.

In that context, we’ve introduced some nice little features that will be included across all the Views such as the ability to identify the content owner of particular elements or groups of elements. This means that if we see that the content on a View is incorrect, we can click a link and send an email to that owner to let them know that things have changed or that there’s an error in the content.

We recognise that while most of the meta model concepts for the Information and Data Pack are straightforward, there are some more complex areas, in particular in the definition of relationships. As always, we’ve worked hard to keep these as simple as possible, but the reality of how Information and Data are used and produced is (or at least can be!) complex and we need to be able to capture and manage these complexities. However, the capabilities and results are worth the investment in understanding how to use them. For example, the way the Information-to-Application (and, in that context, to Data) relationships work enables us to understand how a packaged application has been configured in terms of how the out-of-the-box data structures are being used. This means that we can understand where Product Codes are being stored in Location Code tables, for example, and this is the kind of scenario where the ‘devil is in the detail’ – fine-grained things like this can become the source of much larger-scale issues in the architecture.

We’ve already had some excellent feedback on the pack and, based on demand from real-world use of Essential Architecture Manager, we are now looking to extend the granularity of the application-to-data relationships to include data attributes, rather than just data objects. You might not always need to go to that level of detail, but if you do, the capability will be there – and it will be designed to let you model at asymmetrical levels of granularity.

Although it’s currently an optional extension, the Information and Data Pack will be incorporated into the next baseline version of the meta model. We think that its game-changing capabilities are a vital part of the enterprise architecture model, so it is natural that this extension becomes part of the core meta model.

How can organisations justify enterprise architecture in a recession?

Following on from my blog of 19th May, How can organisations afford the luxury of enterprise architecture in the middle of a global recession?, a more detailed article is available for download from the News and Industry section of the EAS website.  In the article I examine the reasons why Enterprise Architecture has become so popular and focus on the areas of importance when budgets are tight, including application consolidation and project portfolio management.

Is Enterprise Architecture only for big companies?

I read Mike Kavis’ blog Is Enterprise Architecture only for big companies? a couple of weeks ago and I think that the points that he makes are spot on.  I certainly agree with Mike that EA is as beneficial for an SME as it is for a large organisation.  How can understanding more about your organisation and having that information to hand ever be a bad thing when trying to make decisions, whether they are regarding technology, people, or the focus for the future?

I am also in complete agreement that 20% of EA is better than none.  I wonder sometimes if people confuse accuracy with completeness here.  Certainly if you are working from the wrong information then this could be more damaging than having no information.  But partial information must be better than none, especially if you are picking the elements of EA that are important in that instance.  The key in any organisation, large or small, is to identify your most pressing business need and use EA to solve that need (see my blog from March 15th – Identifying the problems your EA needs to solve).  Some kind of framework is important in as much as you need a consistent approach to your EA efforts, but this is more pressing in a large organisation with many more architects, each with their own perspective!

As regards cost, if you have had a look through our website, you will have noticed that one of our aims is to make EA more accessible to SMEs.  It is our view that it is hard to ‘do’ EA without a tool (see Jason’s blog from Feb 19th – PowerPoint and Excel for Architecture Modelling; Why Not?), and that, to date, SMEs have been unable to join the party, so to speak, because buying and introducing commercial EA tools often requires significant investment.  We hope that Essential, being free and relatively easy to install and use, has removed at least this barrier.

The Devil is in the Detail

If architecture models are to be used to help make decisions about change in your organisation, those models must include details that reflect the complexities of the real world.

The Problem

I am frequently asked what has become, for me, a classic question: “How far should I go with the detail of my model? When do you stop?”

In general, I would say the more information you can capture, the better. We do need to be able to “see the wood for the trees”, though. We don’t want to get lost in the detail – both from a point of view of the sheer volume of information and also from the perspective of capturing things that are not currently interesting.

Many models and frameworks tend to stay at quite a high level. This can be OK for some scenarios – e.g. a high-level design or a PowerPoint presentation. However, detailed decision support enabled by information from the architecture model requires the detail. If this is not available then, all too often, when the details come to light they break the model.

When interacting with other parts of the organisation, architects who gloss over the details can often be perceived as ‘arm wavers’ and can quickly lose credibility – even though the ideas themselves may be sound.

In an increasingly technologically educated world, you ignore the details at your peril. The very systems that we have to manage, as Application and Technology Architects, are becoming more and more complex. The Cloud and SaaS can provide significant benefits, but the services that they provide still need to be managed as part of the Enterprise Architecture. The ‘details’ are still there but now we’re not directly in control of them. We cannot just ignore them because we have a service provider that manages how that service is delivered.

Really, we need to be able to ‘roll up’ the detail when required. Also it would be great to be able to easily come back to elements of the architecture that we didn’t initially capture to any great depth, and enrich the detail later, when we have it.

I’ve used some technology examples above, but the need to recognise the details exists just as much in Business Architecture. One of the big issues here is that we cannot ignore IT when looking at the Business Architecture. It is the reality of today that nearly every process in the organisation is supported by IT in some way.

A Way Forward

So, how can we manage the details without suffering information overload or finding ourselves in a modelling equivalent of painting the Forth Bridge?

Generally, details are missed because when we uncover them, we have nowhere relevant to capture them. So they are ignored or forgotten. If we could capture all the relevant details as we find them – and model them so that everything we know about some aspect of the enterprise can be related as appropriate to others – this would really add to the organisation’s ‘knowledge base’.

Now, modelling of this nature requires a detailed ‘language’ if we are to be able to reflect the nuances of things at the detailed level. However, this language must enable us to “black box” elements in the model – to save us from getting lost in the detail and to deal with them at the right level of granularity for the activity that we’re working on right now.

The important thing about not ignoring the details is that our model needs to reflect the real world if we are to be able to make effective decisions from the knowledge we have captured. This might mean that the modelling activity or the model itself requires some complexity. We cannot hide from modelling complex things when they are truly complex in the real world.

The ability to capture the required details yet roll elements up as “black boxes” is one of the key drivers behind much of the development of the Essential Meta-Model. We started with a simple set of core concepts and introduced additional concepts incrementally to manage the detail as we found we needed it.

Don’t get lost in the details of your enterprise. Embrace them and manage them locally, in their relevant context (don’t worry about the servers if you’re looking at how your applications support your business processes!). Understand how these contexts, which are the layers and views of the Enterprise Architecture, fit together and the details will take care of themselves.

If something’s hard to do, then it’s not worth doing – Homer Simpson

The above quote from Homer Simpson could be applied to a lot of people when they consider embarking on Enterprise Architecture for their organisation.  Luckily, I’m not one of them!
In principle Enterprise Architecture isn’t hard at all.  I was trying to explain what I do to a friend of mine who is a builder/runs a swimming school and he couldn’t really understand the issues.  It seemed inconceivable to him that an organisation could have little understanding of the applications they are running and all the links between them; or no understanding across the organisation of the business strategy and what changes this would require from the business and IT communities; or be running two applications that actually do the same thing; or have two processes that achieve the same result.
This, of course, is symptomatic of the difference between a small business and a large global organisation, but it highlights that in principle Enterprise Architecture is a no brainer.  You simply need to understand the processes that your people are going to perform to achieve your business strategy, the applications that will assist them, the information that the people and applications will require and the underlying technology that will allow the applications to run.  Once you have this understanding, making adjustments, whether to streamline your business or because of a change in your strategy, is easy.
Of course, in practice ‘doing’ EA is never easy, and the ‘understanding’ element is only part of the equation.  Even if you understood all the elements listed above, achieving the ‘control’ required to manage change is still a significant challenge.
Nonetheless, contrary to Homer’s advice, even though doing EA is hard, it is definitely worth doing.

What are the Essential Project Team up to?

If you’ve been keeping an eye on the forums you’ll know that we’re in the process of putting together a sample repository.  The sample will take a fictional investment management company as an example and show a thin slice through all the meta-model layers to demonstrate the linkages between the different architecture concepts.  We chose an investment management company as it’s an area we know well, but also because we think it is broad enough to be understood without having to be a domain expert.  We’re currently populating the model and are expecting to get it out in the next couple of weeks – time permitting.
Something we’ve been asked a number of times via the forums is for a pictorial representation of the meta model, showing the links within and between layers.  We have been working on this and it is nearing completion so we hope to have it up on the site within a week or so, again, time permitting.
With many people trying out Essential now, it would be great to hear views on it, and ideas as to where the tool, the website or the documentation could be improved.  Please do let us have your views on the Forums.

Identify the problems your EA needs to solve

In my opinion, when introducing enterprise architecture into an organisation the most crucial aspect is the identification of the problem you need it to solve.  And to do this effectively I believe that you need to clearly understand the aims and objectives of the business at that time.

We often find the EA team sitting within the IT organisation, but the team still needs to ensure that the identification of need is business led and that the EA is not used simply to document technical IT issues.

For example, it may be that the most important objective for the business at a point in time is to reduce costs to remain competitive in its pricing.  Enterprise Architecture can meet this need in a number of ways depending on the company in question, but one way will be to capture the current state in the application architecture layer to allow consolidation of applications and thereby achieve cost reductions.

However, this would be a wholly inappropriate response if the business need were to expand into new markets.  In this case the Enterprise Architecture should be identifying common processes and services that could be shared across divergent businesses to aid speed of change and agility.

This is at the crux of the reason that, to date, many companies are not satisfied with their EA initiative.  All too often the EA will follow a prescribed, often IT-based, route – let’s capture the current state first; let’s create an application catalogue; let’s sort out our integration architecture – without first determining what the business needs it to do.

Once the link to business need is in place the EA can begin to provide what the business needs and positive results will be quickly seen.

Once the business need is identified, it’s time to consider the tools, people and processes needed to support it.  In terms of processes we have identified two major contributing factors to a successful EA, ‘understanding’ and ‘control’.

By ‘understanding’, we mean having a coherent view of the business and IT assets of your enterprise readily available and accessible in order to support well-informed tactical and strategic decision-making.  As the amount of information to pull together is often large, a tool with the right information captured in it provides the ‘understanding’ part of the equation.

By ‘control’, we mean having appropriate processes and techniques in place to manage and co-ordinate changes to these assets in line with these decisions.

In general, many of the problems that an EA will solve will be IT related, but the link with the business is imperative to ensure that the problem is the most relevant at that time and, crucially, that the EA team are seen as making a real difference to the business in a relatively short space of time.

How can we exploit existing architectural information?

In a bit of contrast to Jason’s last posting, I’ve taken a more technical perspective this time.

Even if you are just starting out with your enterprise architecture initiatives, it is highly likely that you already have architectural information about your organisation captured in some form or other. How can we exploit the existing architectural assets with Essential Architecture Manager? Can we automatically load these assets into the Essential Architecture Manager repository? And if we can do that, can we get information out of Essential Architecture Manager to be loaded into some of our existing systems?

These are questions that we quickly encountered when we started using Essential Architecture Manager in real organisations.

Potential Approaches

Since Essential Architecture Manager is built on Protege, we started to explore the tabs that were already available for bringing external data sources into your Protege project, e.g. the very useful DataMaster.

It is often the case that existing sources of information lack the structure of a formal meta model such as that in Essential, and even when they do have such a structure, there are bound to be meta concepts that do not map directly to the Essential Meta Model.

We therefore find ourselves with a typical application integration mapping and transformation scenario. In order to import the information from the existing source, we may have to combine or split elements from the source information in order to map it to the target. We may also need to create inferred elements from the source information.

The existing plugins for Protege lacked this mapping and transformation capability, and so at first, where an import was only needed as part of the initial start up of the initiative, we found that we could construct transformation scripts, e.g. XSLT, to generate entries for the .PINS file of the Protege project and then paste them manually into the file – an approach known as ‘PINS hacking’ in the Protege community.

While this solved some immediate problems, we identified a number of limitations with this approach, such as importing into projects that use a database backend (there’s no .PINS file to hack!), creating new, unique instance IDs in Protege, and of course on-going synchronisation between the Essential repository and updates to the external information sources.

The Solution

It was clear to us from this point that in order to reliably and predictably import information into Essential Architecture Manager, we needed to interact directly with Protege through its API and not through its underlying data store – good practice for any data integration, really. This way, Protege could take care of creating unique instance IDs, defining relationships between imported artefacts, and so on.

We found that through the Script Tab, we could “drive” Protege, via its API, in a way that effectively automated the steps that you would do if you were using the front-end GUI. This way, we knew that Protege could properly manage the integrity of the repository. We just needed an effective way of turning the source information into Protege scripts that we could run and we’d have the basis of our solution.
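
To give a flavour of what “driving” Protege from the Script Tab looks like, here is a minimal Jython sketch. It assumes the Script Tab binds the open knowledge base to a variable called kb, and the class and slot names used are purely illustrative; the calls themselves (getCls, getSlot, createDirectInstance, setOwnSlotValue) are standard Protege 3 frames API methods, but do check them against your own Protege version.

```python
# Illustrative Script Tab (Jython) snippet -- assumes 'kb' is the open knowledge base.
app_cls = kb.getCls("Application_Provider")   # class name is an example only
name_slot = kb.getSlot("name")                # a 'name' slot is assumed to exist

# Let Protege create the instance so that it assigns the unique frame ID itself
new_app = app_cls.createDirectInstance(None)
new_app.setOwnSlotValue(name_slot, "Warehouse Management System")
```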

We’ve now been using what we’re calling the Essential Integration Server for several months to synchronise the Essential Architecture Manager repository with an external configuration management suite. You may have noticed that every class in the Essential Meta Model has a slot called ‘external_repository_instance_reference’. This is used by Essential Integration Server scripts to synchronise individual assets between Essential Architecture Manager and one or more external information sources. An instance in Essential can have multiple external references and we can combine information from multiple sources to build a more complete picture of an architectural asset in the Essential repository.

The mapping between the existing information source and the Essential Meta Model is put together by defining an XSLT (or any similar approach) that writes the import script. We are building a library of useful Python script functions that help with things like creating an instance (or returning a reference to it, if it already exists in the repository) or building certain more complex relationships. Using the script language is fairly straightforward and is almost as productive as a more graphical integration tool – mainly because we have the power of the full Protege API plus a rich scripting language that enables us to handle any import / integration scenario. The Essential Integration Server provides a web-based user interface for running these XSLT scripts on the source data and producing the resulting Protege scripts automatically. This is particularly useful when you are running the import on a regular, on-going basis.
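
To illustrate the kind of helper function described above, here is a hedged sketch of a “get or create” routine. It is not the released library – the function name, the use of a ‘name’ slot for matching and the example class name are all assumptions made for the illustration.

```python
# Illustrative helper for Protege's Script Tab (Jython) -- not the released library.
def get_or_create_instance(kb, class_name, instance_name):
    """Return the instance of class_name whose 'name' slot matches instance_name,
    creating it if it does not already exist in the repository."""
    cls = kb.getCls(class_name)
    name_slot = kb.getSlot("name")            # assumes a 'name' slot on the class
    for existing in cls.getDirectInstances():
        if existing.getOwnSlotValue(name_slot) == instance_name:
            return existing
    new_instance = cls.createDirectInstance(None)   # Protege assigns the frame ID
    new_instance.setOwnSlotValue(name_slot, instance_name)
    return new_instance

# e.g. called from a generated import script (class name is illustrative)
app = get_or_create_instance(kb, "Application_Provider", "Warehouse Management System")
```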

However, the Essential Integration Server is not quite fully automated and that’s why we haven’t released it yet in the same way as the other components. Currently, it does everything except run the scripts for you in Protege’s Script tab. That’s the manual step that we are working to automate at the moment.

Getting information out of Essential

I’ve described in some detail how we recommend that existing information is imported or synchronised into Essential Architecture Manager. How about getting the information that is in Essential out to be used by other systems?

Essential Viewer already provides that capability. Rather than producing a ‘report’ that renders HTML, you can simply build a report that produces XML, CSV files or whatever your systems need. We’ve used this with a lot of success and, because it is run from within Essential Viewer, your target system (or your integration environment) can request the extract via HTTP as a REST-style web service.
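
As a simple illustration of the consuming side, here is a short Python sketch that pulls such a CSV extract over HTTP. The URL and report name are made up for the example – substitute the address of your own Essential Viewer deployment and report.

```python
# Hypothetical consumer of a CSV extract published by an Essential Viewer report.
import csv
import urllib.request

# Illustrative URL -- replace with your own Viewer host and report name
REPORT_URL = "http://localhost:8080/essential_viewer/applicationCatalogue.csv"

with urllib.request.urlopen(REPORT_URL) as response:
    rows = list(csv.reader(response.read().decode("utf-8").splitlines()))

for row in rows[:5]:          # show the first few rows of the extract
    print(row)
```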

Complete solution coming soon

The solution for getting information into and out of Essential Architecture Manager is there. It needs a little more work to make importing fully automated but what we have today is being used in anger on a regular basis in a real organisation right now.

If you need to get your existing information into Essential right now, let us know and we’d be more than happy to supply the current version of Essential Integration Server and to help you build the mapping to the Essential Meta Model.