Showing posts with label APM.

Thursday, 26 October 2017

Digital Portfolios

Soon, almost all enterprises will be digital. The smart, quick-moving start-ups of today will start to feel the pains of maturity as they expand the range of activities they are involved in and the markets in which they operate. The old-world survivors which have adapted will also feel the pain as their IT estates, or should we now say Digital Estates, become increasingly complex.

As this happens, many of them will start to appreciate the need for Portfolio Management of their Digital Assets. The constant churn of Digital as Usual (DAU) means that almost everything will be obsolete or approaching obsolescence, and business requirements will keep changing. Enterprises will need to continuously assess their portfolio and prioritise improvements, changes and rationalisation, as well as their response to threats and to changing legislation as governments react to Digital Disruption in a number of ways.

Anyone responsible for providing Digital Services to their enterprise needs to be able to deal with this, and with the complexity which lies underneath, so that they can spend wisely and assign resources for optimum effect. Doing so requires a high degree of collaboration with other business functions to ensure that a balanced and appropriate approach is taken. This is where Digital Asset Portfolio Management (DAPM) comes in.

At its very simplest, DAPM is about Business Quality, Technical Quality and Affordability. These three things need to be monitored and continuously managed through the layers of a Digital Product, the key layers being: Customer & Business Environment, End-to-End Product Process, Applications, Data, and Infrastructure (cloud, virtual and/or physical).
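The three qualities tracked across the layers can be sketched as a simple data structure. This is a minimal illustration only: the class names, 1-5 scoring scale and `weakest_layers` helper are invented for the example, not part of any DAPM standard.

```python
from dataclasses import dataclass, field

@dataclass
class LayerAssessment:
    """Hypothetical 1-5 scores for one layer of a digital product."""
    business_quality: int
    technical_quality: int
    affordability: int

@dataclass
class DigitalProduct:
    name: str
    # layer name -> LayerAssessment, for the layers named in the text
    assessments: dict = field(default_factory=dict)

    def weakest_layers(self, threshold=3):
        """Layers where any of the three qualities falls below the threshold."""
        return [
            layer for layer, a in self.assessments.items()
            if min(a.business_quality, a.technical_quality, a.affordability) < threshold
        ]

product = DigitalProduct("Online Payments")
product.assessments["Applications"] = LayerAssessment(4, 2, 3)
product.assessments["Data"] = LayerAssessment(4, 4, 4)
print(product.weakest_layers())  # -> ['Applications']
```

In practice the scores would come from the continuous monitoring the post describes, rather than being set by hand.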

Changes in customer expectations, business trends, legislation etc. can impact the overall Digital Product's relevance, market fit and legality. This is a major aspect of business quality and will imply the need for change to End-to-End Product Processes. Product Processes may also be impacted by other issues, such as changes in volume, the ability to deal with increasing product complexity, scarcity of resources, or competition (in performance terms) with other products in the marketplace (i.e. the benchmark suddenly shifts and your process has been left behind). Likewise, applications may fail to keep pace with changing needs. Data may become corrupted or inadequate due to poor information management or a bad fit between the data and the real needs of the business. Infrastructure gradually becomes obsolete, difficult to support or integrate, and weaknesses in security become apparent.


All this gets even more complicated if mergers happen and applications become duplicated, or technical strategy changes introduce new technical platforms into the enterprise.

One of the key things that DAPM has to do is identify all the major components required to deliver a digital product (including containers and serverless functions) and keep track of their condition and costs. This allows the calculation of the unit cost of supporting a product transaction, and enables investments to be assessed not just in terms of impact on quality and effect, but also in terms of the cost of supporting a product.
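The unit-cost calculation is simple arithmetic once the components and their costs are known. The component names, cost figures and transaction volume below are invented examples, purely to show the shape of the calculation:

```python
# Annual run cost of each component supporting one digital product
# (all names and figures are illustrative).
component_costs = {
    "web-app": 120_000.0,
    "payments-api": 80_000.0,
    "container-platform": 45_000.0,
    "serverless-functions": 5_000.0,
    "database": 60_000.0,
}

annual_transactions = 2_500_000

total_cost = sum(component_costs.values())
unit_cost = total_cost / annual_transactions

print(f"Total annual cost: {total_cost:,.0f}")        # 310,000
print(f"Unit cost per transaction: {unit_cost:.4f}")  # 0.1240
```

The hard part, of course, is not the division but keeping the component inventory and cost data complete and current, which is exactly what DAPM is for.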

A mature digital organisation needs to build DAPM into its budgeting and planning activities, and use this to inform its technical strategy. Alternatively, businesses with a huge legacy problem that want to transition to digital may need to use DAPM to identify and prioritise which applications need to be retired, replaced or upgraded to enable the move.

However you look at it, DAPM is an essential digital practice.




Friday, 26 May 2017

GDPR, CIO issues, Lean Data & Data Portfolio Management

Last night's CIO event hosted by Harvey Nash and KPMG was held to launch their 2017 CIO Survey "Navigating Uncertainty".

In the panel discussion afterwards, one of the key issues raised was "knowing where your data is". GDPR is certainly driving this for personal data in Europe, and for anyone who trades with organisations or consumers based there: it is difficult to implement the "right to be forgotten" if you don't know what data you hold and where it is. Similarly, SOX in the US has driven similar concerns about financial data. The move to "Cloud First" also compounds the need, as knowing where your data is is core to successful integration.

So why is this such a big deal, when much of GDPR is about doing things which a business really ought to be doing anyway? Basically, it's ancient history. Most large organisations have grown partially by merging with and acquiring other organisations, and their management teams often have a tendency to declare victory before full integration occurs.

Then there are cost-cutting issues. Most businesses have been through boom-and-bust cycles of large investment followed by cost cutting and asset squeezing. Often this has included headcount reductions or outsourcing, each of which ensures that knowledge about where things are leaves the organisation. Many service providers tend not to document things well, if they are allowed to get away with it, as this helps keep effort and FTE (and therefore costs) down. There is natural staff churn of anywhere between 5% and 20% per year in typical companies, depending upon culture, rates of pay and opportunities. Documentation does not keep pace with lost knowledge, as exit processes are usually poor at knowledge transfer.

Finally, DIY activities in the business often result in unofficial applications being adopted, especially as XaaS makes this easy to do. Put this all together and it is little wonder that organisations often do not know where their data is, or even what data they have. This is a situation which brings inherent risk. If an organisation does not know where its data is, how does it protect it? If no one knows what data is held and "managed", then how is it integrated, kept coherent, kept clean and timely? How does the organisation know what it is actually spending on data, or even what the value of its data is? Then there is the small matter of compliance: how does the organisation know whether it is complying? These are all data hygiene issues which need to be addressed if digitisation is going to support a Digital Business Model.

So now is the time to introduce Lean Data and make sure that Data Portfolio Management (DPM) is practiced as part of any approach to Asset Portfolio Management. (Asset Portfolio Management = Application Portfolio Management + Infrastructure Portfolio Management + Data Portfolio Management).

Lean Data principles mean that:
  • Organisations know what data they hold and manage;
  • Data is classified according to subject area and criticality;
  • Only the minimum data necessary to Add Value to the business is held;
  • Data replication is kept to the minimum level necessary to optimise business performance;
  • Data Value is determined by its utility in Serving the Customer, Supporting Essential Capability, Protecting the Organisation, Providing Insight for Business Decision Making.
Data Portfolio Management is concerned with:
  • Knowing what data is held and where it is;
  • Understanding the quality of the data;
  • Knowing what technology is used to manage the data and its overall condition;
  • Being able to address questions concerning issues such as criticality, protection, archiving, cost of management;
  • Understanding how Master Data Management (MDM) and integration occurs;
  • Knowing who has Stewardship responsibility and consumer rights for the data;
  • Regularly reviewing management actions to improve Data Value and address Lean Data principles.
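The concerns above amount to keeping a structured inventory of data assets. A minimal sketch of one such inventory entry follows; the field names, example systems and the `gdpr_review_candidates` helper are all invented for illustration, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    """One entry in a hypothetical Data Portfolio inventory."""
    name: str
    subject_area: str            # e.g. "Customer", "Finance"
    criticality: str             # e.g. "high", "medium", "low"
    location: str                # system or service where the data is held
    technology: str              # technology used to manage the data
    steward: str                 # who has stewardship responsibility
    contains_personal_data: bool
    annual_cost: float           # cost of managing this data

def gdpr_review_candidates(portfolio):
    """Assets holding personal data - these must support the
    'right to be forgotten', so you need to know where they are."""
    return [a.name for a in portfolio if a.contains_personal_data]

portfolio = [
    DataAsset("CRM contacts", "Customer", "high", "Salesforce",
              "SaaS", "Sales Ops", True, 40_000.0),
    DataAsset("GL postings", "Finance", "high", "SAP",
              "RDBMS", "Finance", False, 75_000.0),
]
print(gdpr_review_candidates(portfolio))  # -> ['CRM contacts']
```

Summing `annual_cost` across the portfolio also answers the "what are we actually spending on data" question raised earlier.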




Friday, 19 May 2017

Splunk - Digital Automation for CyberPunks

Last week I went to Splunk's event held at the InterContinental next to the O2 tent in Greenwich. 

This was a very well attended event and I got the impression that Splunk has now emerged as a dominant player in the DevOps area around the automation of Operational Monitoring and Fault Analysis.

What I had not realised before going to the event, although I had coincidentally been discussing the potential the week before with a former colleague at Google's event, is that Splunk now provides a credible Security Event Management toolset for use in Security Operations Centre (SOC) activities, as well as a user activity analysis tool. In fact there were some interesting case studies focusing on building Lean SOCs incrementally. N.B. Gartner now positions Splunk as a leading vendor in its magic quadrant for SIEM.

It was also interesting to hear that Splunk now has a full scale partnering programme with other technology vendors, enabling integration with both new sources of data for exploitation within Splunk as well as value adds to Splunk, thus offering greater levels of automation.

Splunk was, however, a little vague about future directions for the toolset. There does appear to be an opportunity around Application Cost Management and hooks into general Application Portfolio Management. This arises because, to use Splunk effectively, you have to build a model of each application monitored which covers all the infrastructure elements (physical or virtual) used within the application, in a similar manner to the models used in OBASHI or TBM (two similar but competing approaches to cost management), or in architectural tools such as Troux (now Planview) and Alfabet used for application portfolio management.
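The kind of application-to-infrastructure model described above can be sketched as a simple mapping. All names and costs below are invented, and real OBASHI/TBM models are far richer, but this shows why such a model doubles as a cost-management asset:

```python
# Which infrastructure elements each monitored application depends on
# (illustrative names only).
app_model = {
    "order-service": ["vm-web-01", "vm-web-02", "db-cluster-1", "load-balancer-a"],
    "billing": ["vm-app-03", "db-cluster-1"],
}

# Annual cost of each infrastructure element (illustrative figures).
infra_cost = {
    "vm-web-01": 3_000, "vm-web-02": 3_000, "vm-app-03": 4_000,
    "db-cluster-1": 20_000, "load-balancer-a": 2_000,
}

def shared_elements(model):
    """Elements used by more than one application - where cost
    allocation decisions are needed."""
    seen, shared = set(), set()
    for elems in model.values():
        for e in elems:
            (shared if e in seen else seen).add(e)
    return shared

def app_cost(model, costs, app):
    """Naive roll-up: shared elements split evenly across the apps using them."""
    usage = {e: sum(e in elems for elems in model.values()) for e in costs}
    return sum(costs[e] / usage[e] for e in model[app])

print(shared_elements(app_model))                  # {'db-cluster-1'}
print(app_cost(app_model, infra_cost, "billing"))  # 4000 + 20000/2 = 14000.0
```

Once a monitoring tool already holds this dependency model, attaching condition and cost data to it is a small step, which is the integration opportunity the post points to.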

This would enable a more integrated approach to some aspects of managing an application estate, gathering technical condition and cost information together to support continuous portfolio management.