Enterprise Architecture Firm of the Year 2018

We are delighted to announce that, for the second year running, EAS has won top place in Corporate Live Wires’ Innovation & Excellence Awards.  Following on from receiving the award for Innovation in Enterprise Architecture 2017, this year we have been named as Enterprise Architecture Firm of the Year 2018 based on Essential and our consulting work around it.

The Innovation & Excellence Awards are judged by a panel of industry experts following nominations from the wider community.  The Essential Project, the world’s most popular open-source EA Tool, is now available as SaaS through Essential Cloud.  EAS, Essential’s founders, are all practising architects, and we continue to extend and improve Essential by listening to our clients and our community and reacting to their needs.  We were the first to exploit the connection between the information required for EA and GDPR, launching a GDPR support pack that brings massive benefits to both the Compliance and EA Teams within an organisation.

We are continuing to develop Essential to support our customers and our community; look out for our innovative data capture and maintenance solution which will ease the burden of data management and revolutionise EA!

The EU's Gift to Enterprise Architects

An interesting article from John – ‘The EU’s Gift to Enterprise Architects’ – discusses how EAs can make best use of the data that GDPR is forcing organisations to collect and keep updated, to increase credibility and value to the business.  Essential’s GDPR Monitoring pack uses this information to provide GDPR support, but also harnesses the data for analysis of more traditional EA fare, such as APM, Data Management and so on.  See a demo here, and our ‘5 Steps to Effective GDPR Monitoring’ blog here.


Essential Version 6.1.1 Released

Essential v6.1.1 is now available from the downloads section of the website.

We’ve promoted some of the views that we’ve been trying out in the labs, including the NIST mapping and a couple of Business Capability to Technology views.  We’ve also enhanced some of the out-of-the-box views, such as the Business Process Model, the Business Capability Model and the Technology Platform Model.  Meta model extensions include a new class, ‘Business Process Family’, to make it easier to model and analyse standard and non-standard processes.

Full details of all the changes can be found in the release notes.

5 Steps to Effective GDPR Monitoring

Wherever you are on your GDPR journey, the 5 steps detailed below must be completed and can provide a useful checklist for progress.  This is based on several years’ experience we’ve had supporting the PII data requirements of global organisations.

  1. Assemble a Cross Business Team
    A successful GDPR initiative needs a number of different roles from across the Business and IT including, but not limited to, the following:
    • Compliance – The compliance team are responsible for defining the scope of GDPR data for an organisation and the allowed usages for GDPR data, i.e. defining the legal basis for use across the organisation. Compliance are also responsible for analysing the information returned and ensuring that remediation is put in place.
    • GDPR Coordinator – Responsible for ensuring that each business unit provides the detail of the data they process, the purpose and the applications used.  They should brief Business Units, coordinate and QA the information returned and manage queries.
    • Business Units – Responsible for providing the detail of the data they process, the purpose and the applications used for their business area, accurately and completely.
    • IT – Responsible for providing the detail of the applications and systems they are responsible for, accurately and completely.
    • Project Manager – Responsible for creating the plan, coordinating resources, managing dates and deliverables and providing senior management reporting.
    • Analyst – Responsible for analysing and modelling the data received from IT and the Business Units, for example ensuring there are no duplicates, and providing this to Compliance in a format that they can utilise to manage GDPR.
  2. Define the Data in Scope for GDPR and Define the Allowed Data Uses
    The data that is in scope for GDPR will vary from industry to industry and organisation to organisation; each organisation must therefore define the data in scope for them. They must also define the data that is allowed to be used for each business purpose and whether or not consent is required.  We would recommend doing this before the fact-finding exercise, as it provides a structure and minimises the possibility of duplication and data gaps.
  3. Get the Business Teams to Provide Detailed GDPR Data
    The Business Teams will need to provide the data on their processes, purposes, data and applications used. Additionally, IT will need to provide information on the data held in databases, where the databases are stored and located, and the security surrounding both the applications and the underlying technology.  There will need to be a standard means of capturing this detail to ensure consistency, so make sure the business has clarity on what is required – utilise your Data in Scope for this.  Once this data is provided, a central team should QA and analyse it to ensure it provides an overall view of the business situation regarding GDPR.
  4. Gap Analysis and Action Plan
    A gap analysis and action plan should be created to work towards GDPR compliance. Establish an ongoing process so that compliance is continually demonstrated, and engage both the business and the IT teams in defining that process.
  5. Report to the Regulator
    The regulator will need to see evidence that you are on top of the new regulations; you will need to demonstrate that you have assessed your organisation against them, that you understand where you are compliant and that you have a plan in place to rectify any issues. They will also want you to demonstrate that you have a plan in place to manage GDPR as an ongoing commitment within your organisation, i.e. people, processes, technology and changes.

EAS have formed a partnership with UST Global and released the Essential GDPR pack, which enables organisations to understand their GDPR compliance and risk from both a business and an IT perspective.  The objective of the tool is to demonstrate to both your CEO and the Regulator that the GDPR position is understood and under control; this is achieved through a series of interactive dashboards and detailed views that can be viewed online or printed out to suit the needs of both key stakeholders.

Our feedback indicates that whilst organisations have assembled teams and started data capture, many are proposing to manage GDPR compliance in a series of spreadsheets.  In our experience this is not sustainable; with such a large and constantly changing data set it is almost impossible to collect and structure the data in such a way as to answer all the regulator’s questions whilst keeping pace with change.  A GDPR tool with a comprehensive meta model, repository and adaptable viewer, allied to a very structured data capture process, makes this task achievable and, in fact, allows the data captured to be used to support other initiatives such as data management, application portfolio management and so on, enabling organisations to make full use of the data that must be captured for GDPR.

EAS, in partnership with UST Global, can accelerate your GDPR initiative by bringing our combined experience and the Essential GDPR pack to:

  • Work with you to create a detailed plan to help you gear your organisation’s GDPR initiative for success, including the roles and responsibilities required across the business.
  • Work with your Compliance Team, or external organisations such as solicitors, to accelerate your initiative by providing quick starts based on our experience of the scope of GDPR data applicable to your organisation, and a business model that will aid understanding of allowed data usage.
  • Provide a set of pre-defined Questionnaires and Online Forms that direct the capture and analysis of the business and IT data required from your organisation.  Work with you to create a process to keep this data up to date.
  • Incorporate, through our partner UST Global, an automated data discovery tool that finds GDPR data in your databases and document stores, covering both structured and unstructured data such as PDFs. The results can be automatically loaded into Essential GDPR to supplement the manual data discovery carried out by business and IT teams, improving accuracy and accelerating the process.  The UST tool can also support the ‘Right to be Forgotten’ requirement, highlighting all the instances where a person exists across your organisation.
  • Provide, through Essential GDPR, powerful dashboards and visualisations of your GDPR data, allowing you to proactively manage your GDPR compliance and demonstrate to both your CEO and the Regulator that you are in control of your GDPR exposure, highlighting where you are compliant, where you have issues and where your risks lie.
  • Allow you to utilise the data that you have collected for GDPR to provide additional benefits across your organisation, such as identification of rationalisation opportunities, etc.

Find out more about the Essential GDPR pack or contact us.


Essential Cloud in General Availability

We are delighted to announce that Essential Cloud is now in General Availability.

We’d like to thank all the clients that gave their feedback, allowing us to enhance Essential Cloud over the last six months.  We’ve updated the user interface, adding a journey history and improving collaboration capabilities through updated notifications, and we’ve extended the integration capabilities.  Importantly, we’ve had lots of positive feedback and no major issues, so we’re confident that Essential Cloud will prove to be as stable as Essential Open Source has been for the last eight years.

Find out more here

Essential GDPR Launched

Our GDPR pack is now ready for use.  Unique in the marketplace, it supports business questions such as ‘do I have a legal basis for using this data?’ and ‘have I captured the client’s consent?’ as well as technical access and security questions, such as ‘where is my data most at risk?’.  Most other tools are focused on one end or the other of this spectrum.  High-level dashboards show where the GDPR compliance issues exist, and drill-down capabilities allow you to home in on the exact process, application or technology that is the cause of the risk.

We have partnered with UST to, optionally, incorporate the use of their ground-breaking data discovery tool which can identify structured and unstructured GDPR data in databases and document stores across the organisation. This not only eases the burden of data capture but also provides an invaluable cross-check of information provided through more traditional means.

A sample of the dashboards is shown below, or you can read further information, access the GDPR demo viewer, or sign up here.

Essential Labs

The team behind Essential are all practising architects and, during our client engagements, we are always having ideas for new views, or improvements to existing views, that can provide additional benefit.  We thought we’d make our labs available to our users so you can see what’s in development and make use of the updates immediately.

Often all, or at least most, of the data is already present in a repository, so these updates are really quick and easy to deploy and start using.  Feel free to download and use them with your own data if they look like they will help you.

Currently we have a new NIST compliance view, with a view loader, that enables you to monitor your NIST controls and assessments, plus some small updates to the Principles and Business Capability views.

The only thing we ask is that you tell us if you have suggestions for improvements or find any bugs.

Data Lens

You may have noticed from our site that the Data Lens is in beta.  It’s a lens that we’ve developed because we’ve been continually told that people don’t have control of their data.

In our EA consulting, we have seen:

  • Organisations that were unwittingly reporting incorrect MI figures because data was inaccurate or incomplete
  • Projects that intended to master and duplicate data that already existed in the organisation
  • Inconsistency in what people thought certain data was
  • Differing views on where data was sourced from
  • Projects repeating the same data collection work, asking the same questions again

The Data Lens looks to address this by bringing transparency and coherence to your data estate.  It is aimed at supporting the demands of people wanting to use data, such as:

  • Data Lake or Analytics efforts, which need to know where data is sourced from, what terms are used for the same data (e.g. client and customer), and how good the data is in terms of quality and completeness
  • Platform projects, which need to know where data masters exist, where data flows and how data is transformed
  • Data rationalisation projects, which need to know where master sources of data exist, where duplication exists and how data is used
  • Data Scientists, who need to understand the sources of data available for their analysis

The lens addresses these needs by providing a number of views and tools.

The Data Definition views provide data definitions, summaries and dynamically produced data models.

The Data Architecture Analysis views are geared towards you understanding sources of data, data flows, where duplication exists, etc.

Data Management is where the lens excels.  You are able to understand data quality across a number of criteria and see sources of data.  The Quality Dashboard shows the quality of the key data required to support your strategic objectives and business capabilities, and also the initiatives impacting that data.  This allows you to identify where your data initiatives may need to be focused to improve your business data output and enable your strategy.  The Data Quality Analysis page lets you pick the data you need and then shows you where to source it from, plus the quality, completeness and accuracy of that data. This is really useful if you are using the data for other purposes, e.g. MI reporting or analytics. The data dashboard provides a summary view of your data which you can drill down into.

We see the Data Lens acting as the bridge between business users and the tools that focus on the physical data layer, which typically meet the needs of the technical teams but not of business users or data scientists.  Equally, where you have conceptual data in a tool, the lens can act as the bridge to the physical data, removing the gap between the conceptual and physical layers and bringing context and meaning to the data.

The lens is currently in beta, but we are allowing organisations to register an interest and we would love to hear your feedback on it.

Essential Cloud – Available Now

Today marks a step change in the life of The Essential Project as we move to Public Preview of Essential Cloud, the final step before General Release. A cloud offering has been at the top of the Essential Community request list for some time and we have combined the best of the Essential Project with a cloud based service to provide additional enterprise capabilities. As well as all the benefits of Essential Open Source, Essential Cloud offers a comprehensive security interface covering both the instances in the repository and the viewer, a user-friendly, browser-based data capture interface extended to include tablet and mobile access, an enhanced viewer environment and single sign-on support via SAML. As this is a cloud service, technical support is automatically included as are platform updates, to ensure that you can keep up with the latest Essential developments with none of the hassle.

In line with our focus on value, Essential Cloud will be a low-cost option, with an annual subscription covering access to both the modeller and the viewer for unlimited users. We are not utilising a seat-based license model as the feedback from the Essential Community and our clients is that the key to an effective architecture initiative, one that provides value to the business, is to enable the users to own and update key aspects of the architecture, i.e. those areas that do not require modelling expertise, such as dates, ownership, governance models and so on. This spreads the load of keeping organisational information up to date and enables architects to focus on business value rather than being distracted with managing routine updates. A seat-based license model does not fit with this approach as the costs quickly become prohibitive; we would rather an organisation’s investment in EA is used to build out their architecture than pay for licenses.

To support this new model further, we are working with our user groups to design new data capture mechanisms that will provide business users with easy access and enable them to update information without having to understand the detail of the meta model or architecture modelling techniques. We already have some early prototypes, and we see this as an important way of enabling EA to continue to provide value to the business.

This is an exciting step in the broadening of the Essential platform, but we do want to assure you that we remain fully committed to Essential Open Source. This will continue to evolve in parallel with Essential Cloud and, crucially, the meta model will remain shared so both platforms will benefit from all advancements as well as the ability to move easily between Cloud and Open Source. Going forward we see the Essential Community consisting of both Open Source and Cloud users. We greatly value the contribution made by the community and we will continue to look to them to help us evolve the Essential proposition to ensure it remains at the forefront of knowledge driven decision support.

We have created an overview video showing Essential Cloud’s capabilities and we will also be holding a series of Webex sessions where we will provide a demonstration of Essential Cloud and hold a Q&A session.

If you are interested in the Webex or Public Preview sign up here.

The Public Preview benefits are, of course, in addition to the existing benefits that are provided across both Essential Cloud and Open Source:

  • Over 100 out of the box views focused on analysis, road mapping and decision making
  • Ontology based meta model for an entire organisation, with the ability to support other EA Frameworks
  • Import and export of data via our unique Excel import utility, with fast-start view loaders, and APIs to integrate with existing data sources
  • Access to business focused lenses providing dedicated support for key areas such as Application Portfolio Management, Data Management, Strategic Resource Optimisation

If you are new to Essential, one of our most exciting recent developments – Essential Cloud aside – is the addition of add-on, business-focused lenses.

The lenses have been in our minds since 2010, when Jon Carter wrote a blog titled ‘Welcome to the View Store’, suggesting the concept of an app store for Essential. We were staggered by the interest in, and uptake of, Essential, so our early focus was on developing the tool functionality, but now we have made our earlier vision a reality.  Our business outcome focused lenses consist of a series of dashboards and views that respond to specific business problems, supported by everything you need to light up the views – data capture spreadsheets, import specifications, process documentation and videos.

The lenses have provided an ideal means for us to partner with organisations outside of the usual EA arena, allowing us to extend the use of Essential to cover different aspects of an organisation. For example, we have partnered with a strategic resource specialist to create a strategic resource optimisation lens which enables organisations to ensure they have the right staffing resources in place to meet the future demands of the business, such as the skills to enable digital business expansion. We have a couple of packs on offer now: Application Portfolio Management and Strategic Resource Management, and we will be expanding the offering shortly to include Data Management, and then, over time, we plan to move into many additional areas such as M&A, Outsourcing Support, Financial Regulations, etc.

We also have a set of low-cost View Loaders that provide the templates to do bulk data capture into Essential Cloud or Open Source. So, if you want to get bulk data into Essential quickly, the loaders can speed this up.

If you would like to develop your own pack to put on the view store, or talk to us about an idea for joint development of a pack, please contact us.

Configuring the Server Memory Settings for Essential

One of the most common issues when setting up Essential is getting the memory settings of the various components configured correctly. As the complexity of the Essential meta-model and of Essential Viewer has increased, so have the memory resources required by the server to support them.

Our current recommendation for a server is a multi-core processor, such as an i5/i7 or Xeon equivalent, and, more importantly, plenty of RAM. 4GB is a minimum but 8GB is more practical; you’ll struggle to use the Import Utility and Viewer together on a system with only 4GB of RAM. Assuming you’re running all the components on the same server (which is perfectly fine and can yield great performance), here’s how we’d allocate the RAM across the main components…

  • Tomcat running Essential Viewer – 2GB RAM
  • Tomcat running Essential Import Utility – 2GB RAM
    • We’d install the Essential Import Utility on a separate instance running on a different port, e.g. 9080, as this improves stability and performance
    • If running both on a single instance then allocate 4GB RAM to Tomcat
  • Protege – 1.5GB RAM
  • If running a Database configuration we’ll ensure there’s about 1GB for that
  • We need some memory for the OS to run smoothly so about 1GB for that

This adds up to about 7.5GB. In reality, you’ll rarely use all that RAM simultaneously however this configuration is one we’ve used countless times with excellent performance.

So, now you’ve got plenty of RAM – how do you configure the components to use it?

First up, make sure you’re using the 64-bit versions of all your components. If you’re running 32-bit versions, you’ll max out at 1.5GB, which will work whilst the repository is small but will cause you problems later on.
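As a quick sanity check, you can look for the “64-Bit” marker in the Java version banner. This is a sketch, not an official check – 64-bit HotSpot/OpenJDK builds print “64-Bit” in their banner, but other JVM vendors may word it differently:

```shell
# Heuristic check for a 64-bit JVM: 64-bit HotSpot/OpenJDK builds
# include "64-Bit" in their version banner (printed to stderr).
is_64bit_jvm() {
  "$1" -version 2>&1 | grep -q "64-Bit"
}

# Example usage (assumes `java` is on your PATH):
# is_64bit_jvm java && echo "64-bit JVM" || echo "32-bit JVM - expect a ~1.5GB heap cap"
```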

Protege

On Windows:

  1. Start Protege. Go to File->Preferences->Protege.lax
  2. Update the row for the property ‘lax.nl.java.option.java.heap.size.max’
  3. This is set in bytes, so set it to 2048000000 (roughly 2GB) for installs with the 64-bit Java environment.
  4. Click OK
  5. Restart Protege

On Mac:
If you run Protege on a Mac by double clicking an icon, you need to edit the Info.plist file that is hidden within that icon. Right click the icon (or ^-click for one-button mice) and click “Show Package Contents”. A new Finder window will come up. Double click “Contents” and then “Info.plist”. Traverse down the tree as follows: “Root” –> “Java” –> “VMOptions”. In VMOptions, edit the -Xmx line to indicate the correct memory usage, e.g. 2048M. Note that this can be specified in megabytes by using the ‘M’ suffix.

For example, here are my settings:
<key>VMOptions</key>
 <array>
 <string>-Xms250M</string>
 <string>-Xmx2048M</string>
 <string>-XX:MaxPermSize=512m</string>
 </array>

Save the changes that you’ve made and restart Protege for these to take effect.

This principle also applies to the Protege server. If you have not already, update the ‘run_protege_server.bat’ / ‘run_protege_server.sh’ file to increase the maximum memory JVM option as follows by setting the -Xmx parameter:

For Unix / Mac / Linux:

MAX_MEMORY=-Xmx2048M -XX:MaxPermSize=512m

On 64-bit Windows platforms (with the 64-bit Java installation):

set MAX_MEMORY=-Xmx2048M

On 32-bit JVMs on 64/32-bit Windows, there’s a limit to how much memory can be allocated:

set MAX_MEMORY=-Xmx1536M


Tomcat / Essential Viewer / Essential Import Utility

The memory settings for the Tomcat that is running the Essential Viewer should also be set to around 2GB for 64-bit Java environments.

On Windows

If you are running Tomcat as a Windows service, you can set the upper memory limit using the tomcat8w.exe program. You’ll find this either in the Start menu or in the install folder of Tomcat. Running it will pop up a configuration panel.

  • Select the ‘Java’ tab and then set the parameter for the Maximum memory pool to 2048
  • Click Apply
  • Restart Tomcat for these settings to take effect.

On Mac

If running the Viewer Tomcat on a Mac / Linux platform, you can set these in the ‘setenv.sh’ file in <TOMCAT INSTALL>/bin by setting the CATALINA_OPTS variable, e.g.:

export CATALINA_OPTS="-Xms128m -Xmx2048m -XX:MaxPermSize=512m"
export JAVA_OPTS="-Djava.awt.headless=true"

If this file doesn’t exist then simply create a new text file and save it as setenv.sh with these lines in it.
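Creating the file can be scripted. Here is a minimal sketch; the TOMCAT_HOME path below is an example placeholder, so substitute your own Tomcat install directory:

```shell
# Create <TOMCAT INSTALL>/bin/setenv.sh with the recommended memory settings.
# TOMCAT_HOME below is an example path - point it at your own install.
TOMCAT_HOME="${TOMCAT_HOME:-./tomcat}"
mkdir -p "$TOMCAT_HOME/bin"

# Quoted heredoc so the JVM options are written literally, not expanded.
cat > "$TOMCAT_HOME/bin/setenv.sh" <<'EOF'
export CATALINA_OPTS="-Xms128m -Xmx2048m -XX:MaxPermSize=512m"
export JAVA_OPTS="-Djava.awt.headless=true"
EOF
chmod +x "$TOMCAT_HOME/bin/setenv.sh"
```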

Again, you must restart Tomcat for these settings to take effect.

Troubleshooting

If things aren’t working as expected, the log files are your friends. The Protege log is in the Protege install folder under logs and is called protege_###.log. The Tomcat log is in the Tomcat install folder under logs and is called catalina.out. What you’re looking for is anything that mentions “memory” or “heap”. If you’re seeing these errors, then the settings haven’t been configured correctly.
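To speed up the search, a case-insensitive grep across the logs works well. The paths in the example are typical locations rather than guaranteed ones, so adjust them to your own install:

```shell
# Scan log files for memory-related errors, case-insensitively,
# printing matching lines with their line numbers.
check_memory_errors() {
  grep -inE "memory|heap" "$@" 2>/dev/null
}

# Example (paths are illustrative - use your own install locations):
# check_memory_errors /opt/tomcat/logs/catalina.out /opt/Protege/logs/protege_1.log
```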

As always, you can post your questions on the Essential Forums at http://enterprise-architecture.org/forums and we’ll answer as quickly as we can. Don’t forget to use the search too as there are over five years of posts and there’s a good chance your question has been answered before.

Once you’ve got these settings right, you should have many years of stability and performance from your Essential Install. If you’re still having problems though and would like some professional support then contact EAS via the Services menu for more information on how we can help.