Tuesday 15 May 2018

Why Partnering and Sourcing Need to Be Managed in Digital Enterprises

The simplest form of sourcing in a digital enterprise is to use open source code. These days, modern IT shops use a high percentage of open source code because it raises productivity significantly. This is important because HTML5, CSS, Java, JavaScript, Ruby and the like are all fairly low-level building blocks. Used in their naked form, they are like hand tools and limit productivity. Combined with third-party open source code, a development shop's productivity rises significantly and developers can focus on delivering new business value rather than "reinventing the wheel" by developing capability which already exists.

There is a downside, however: use of third-party open source code can introduce other issues. Traditionally, most businesses focused on IPR ownership and the fear that someone else might end up owning the applications they were investing in. These days the focus has shifted to three key issues:

  • Technical Debt,
  • Security Vulnerabilities,
  • Software Entropy.

Technical Debt describes the issues that are typically buried in code and which make it buggy and difficult to support. These may arise from use of legacy code conforming to old standards (or languages), poor programming style and structure, lack of documentation, lack of flexibility to change, and programming errors or bugs. To some extent this can be seen as Agile Development's dirty little secret: a philosophy which prioritises working code over documentation, as well as speed to market, can lead to poor practice and the inevitable accumulation of technical debt. Inevitably, quite a bit of the open source software available suffers from technical debt. So it is important to manage the way in which open source software is acquired: ensure that it is sourced from reputable organisations, apply some due diligence to validate its quality before adoption (as often there are multiple sources of equivalent software) and ensure that the latest version is being adopted.

Security Vulnerabilities are almost inevitable in software development. There are so many ways that software can be attacked that it is often difficult for developers to conceive of, or cover, all of them when developing applications. Open source is no exception. So again it is important to validate that adopted software is secure and that appropriate versions are deployed, as it is often possible to acquire multiple versions of the same open source product or component, and a newer version may fix vulnerabilities recently discovered in an older one. Equifax tripped over this problem dramatically and incurred severe business disruption by running older, insecure versions of open source code when fixes were available.
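This kind of version checking can be automated. Below is a minimal sketch in Python of the idea; the advisory list, package names and version numbers are purely illustrative, not a real vulnerability feed:

```python
# Flag pinned dependencies that fall below the first version known to
# fix a published vulnerability. ADVISORIES stands in for a real
# advisory feed; the entries here are made up for illustration.
ADVISORIES = {
    "struts-like-lib": (2, 3, 32),   # hypothetical: fixed in 2.3.32
    "json-parser": (1, 8, 0),
}

def parse_version(text):
    """Turn a version string like '2.3.30' into a comparable tuple."""
    return tuple(int(part) for part in text.split("."))

def audit(pinned):
    """Return (name, version) pairs pinned below their first safe version."""
    findings = []
    for name, version in pinned.items():
        safe = ADVISORIES.get(name)
        if safe and parse_version(version) < safe:
            findings.append((name, version))
    return findings

pinned = {"struts-like-lib": "2.3.30", "json-parser": "1.8.3"}
print(audit(pinned))  # struts-like-lib is below the safe version
```

In practice a real advisory feed (and tuple-safe version parsing for more exotic version schemes) would replace the hard-coded data, but the principle is the same: know what you run, and compare it against what is known to be vulnerable.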

Software Entropy describes the state where multiple versions of the same software product, or even multiple products from different open source developers, are used in the same application to perform the same task. This causes problems because it not only introduces unnecessary complexity, but will probably introduce uncertainty into how the code behaves in different parts of the application. Additionally, if the different versions and products require different component libraries, there will often be an impact on application performance and environmental stability. It seems obvious not to do this, but if all the developers in a project team are free to download and use whatever they want, it is almost bound to happen.
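A simple inventory scan can surface this kind of entropy early. The sketch below (Python, with illustrative module and package names) flags any component that appears in more than one version across an application's modules:

```python
from collections import defaultdict

# Illustrative dependency inventory: module -> list of (package, version).
modules = {
    "checkout": [("http-client", "4.5"), ("json-parser", "1.8")],
    "billing":  [("http-client", "3.1"), ("json-parser", "1.8")],
    "reports":  [("http-client", "4.5"), ("csv-writer", "2.0")],
}

def find_entropy(modules):
    """Return packages used in more than one version, with the versions seen."""
    versions = defaultdict(set)
    for deps in modules.values():
        for name, version in deps:
            versions[name].add(version)
    return {name: sorted(v) for name, v in versions.items() if len(v) > 1}

print(find_entropy(modules))  # http-client appears in two versions
```

Run routinely against the real dependency manifests, a check like this gives the person responsible for sourcing an early warning before duplicate components become embedded across the estate.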

So a well-managed development shop needs someone to be responsible for sourcing open source components, validating their quality and security, and tracking where and how they are used within the application (and across the business's estate of applications). This allows for effective remediation and change when new versions are released, and keeps the code as simple, supportable and secure as is practicable. There should be benefits too, if the relationship with open source software providers is managed to understand what other innovations they support, either directly with other products and new releases, or indirectly through partners and industry alliances who develop complementary capability.

This takes us to wider partnerships. If you are trying to disrupt the market, it is often necessary to partner with other organisations. This may be to gain access to information you don't have, or because they do something which you don't but which your customers value. Partnering allows you to bring new value to the table for your customers, but many organisations fail to do this because they aren't mature in developing partnership approaches. I remember being really astounded when an offshore services provider told me that it normally takes 10 years to develop a partnership with its strategic customers. Why would any customer wait that long?

There are some simple pointers to partnership: 

The first is to understand what problem you are trying to solve for your customers. Then it is relatively easy to identify who would make the right sort of partners to develop;

The second is to culturally align a joint team with your chosen partners, if they are interested in addressing the same things as you together. Cultural alignment involves many things but should include mutual goals, a shared policy of openness (within the scope of your alliance), principles by which you will make joint decisions and common values for the team.

The third is to use design thinking within the joint team to gain insight into the customer problem, recognising that multiple viewpoints, a focus on customer value and experience, and the patience to experiment will lead to more successful products and better leverage of your combined capabilities.

Finally, all participant organisations should recognise the need to commit to managing their partnership like a joint venture business. This means agreeing strategy, committing resources, reviewing progress and applying just enough governance to manage the partnership for value.

Sourcing and partnership management are often overlooked, but they comprise a significant discipline which contributes to execution. As one CEO remarked, "ideas are ten a penny, but execution makes innovation succeed".


Friday 11 May 2018

Business Risks and Digital

Raconteur has an interesting article on Cyber Risk in its most recent edition which focuses on business risk.

What was interesting to me was the graphic from successive business surveys on the top 10 uninsurable business risks.

In 2007, just before the big economic downturn, out of the top 10 uninsurable risks, three of them could be seen as involving IT and an organisation's "digital business model". They were:

  (1) Damage to Reputation/Brand,
  (2) Business Interruption,
 (10) Failure of Disaster Recovery Plan.

The numbers in brackets show the position in the ranking, so (1), for example, is top as the most frequently cited risk. As we have seen in several cyber breaches and other instances where regulatory compliance around protecting consumer data has broken down, IT and Digital are fundamentally linked with reputation and brand, and the other two risks are quite obvious.

By 2011, as businesses were fighting to come out from the economic downturn, the top 10 list had evolved somewhat to include:

  (4) Damage to Reputation/Brand,
  (5) Business Interruption,
  (6) Failure to Innovate/Meet Business Needs,
  (9) Technology System Failure.

Last year, 2017, as AI began to take over from Big Data and IoT as the Digital flavour of the month, the top 10 had morphed into:

  (1) Damage to Reputation/Brand,
  (3) Increasing Competition,
  (5) Cybercrime, Hacking, Viruses, Malicious Code,
  (6) Failure to Innovate/Meet Business Needs,
  (8) Business Interruption.

I have included increasing competition because digital business models are based upon both disruptive product offerings and global reach, which in turn increase competition.

The prediction for 2020 from the survey also includes Disruptive Technologies, presumably some of which would be digitally enabled.

Additionally there has been a constant theme of legislative change, which usually has IT and digital implications. The other interesting fact on the graphic was the top three risks which risk management experts believe that businesses habitually underestimate in terms of impact: Cyber Incidents, Business Disruption and New Technologies.

Overall, not only are IT and Digital issues assuming great business importance, but addressing them has to be built into an organisation's business model.


What Happens To Your Business Model Under GDPR

GDPR comes into full force this month following a two-year transition period, in which organisations were meant to adjust their operations, systems and data to comply.

Central to this is the need for explicit consent to collect and hold someone's data. The data must also be limited to that which is necessary and must not be used for any other purpose than that for which it was collected.

This has major implications for many organisations whose business is based upon consumer knowledge, such as credit rating agencies. Much of their data has historically been collected from multiple sources and aggregated. Gaps have often been interpolated and many associations have been assumed. According to a Raconteur article, one industry analyst claims that people in the industry believe the data to be only 50% accurate!

This probably explains why some of the credit agencies have been keen to let people sign up and check their profiles so that they could "clean their credit history". In fact, people have been paying to correct their own data! So the credit agencies have been paid to receive a free service that improves the quality of their data.

It is difficult to see how this model will continue in the future, as consumer trust in online use of personal data is dropping rapidly; that loss of trust has been one of the factors behind many millennials "divorcing Facebook".

Another impact of GDPR is that the scope of what counts as personal data is now wider and includes things like cookies and URLs. Without express permission these cannot be collected or used for purposes such as driving targeted advertising to your browser. This is going to affect the business model of companies like Google which rely extensively on advertising revenue.

How it all pans out will be interesting, as there are other ways in which advertising can be targeted, but at least there may be some semblance of control over what happens.

Though perhaps the best benefit will be avoiding the flurry of junk mail for car and house insurance when the anniversary of a purchase is imminent and insurance agreements traditionally come up for renewal.

Thursday 10 May 2018

The Perils of the Digital Strategy Anti-Pattern

I started writing the "Way of DAU" because of my experience with a few CxO individuals. The problem was that they saw Digital Adoption of Big Data and IoT technologies as a sort of business elixir which would magically provide them with new insight and enable them to change their business models.

Increasingly these individuals became more impatient as they expected IT to magically "deliver the goods" and transform their businesses. They talked far and wide at business forums and in the press about how transformational digital technologies are, and they decided to develop big, encompassing digital strategies. The trouble was, they never actually articulated where the value lay or what the actual value proposition was. So it was difficult to understand, from a technological perspective, where to start with credible opportunities and business cases which might succeed. Trying to separate genuine opportunity (which justified the investment) from fluff was impossible.

Two years later they were still waiting to implement anything tangible and still talking, whilst other companies were starting to pass them by. Little wonder, then, that Thomas Davenport and George Westerman published an article in HBR on why digital strategies fail. Basically, companies like GE have gone full tilt at implementing digital business models and capabilities without stopping to identify and prioritise value, expecting to get everything right first time.

Unfortunately this is the problem with innovation: it's difficult to get things right at the first attempt. Just look at Dyson; it took him thousands of experiments and years of development to crack the physics behind the bagless vacuum cleaner, and he had the advantage of knowing what problem he wanted to solve. When trying to introduce ground-breaking new digital products, a lot of work is needed to understand what the real problem is before you try to optimise the solution. Then, as a former CEO of Intel said, you may still need to pivot in the marketplace to find where the real sweet spot is and fulfil the dream of market disruption.

George Westerman has even gone so far as to say, in a new article in the MIT Sloan Management Review, that your business does not need a digital strategy; it just needs a business strategy which takes cognisance of digital opportunities. He talks about Building Leadership Capability, Not Abandoning Digital Transformation to IT, Avoiding Siloed Thinking and Not Pushing the Envelope Too Far.

These are some good common-sense starting points and prerequisites for focused action. Let's hope that the C-suite starts to listen.

Wednesday 9 May 2018

The Home of Now

As kids, we used to dream of a home that was computer controlled and did everything for you. In the sci-fi version you called your home by some girl's name and told it to do stuff like open your curtains. This is no longer science fiction with Alexa and Google Assistant. So what can you do to make your home smart, using off-the-shelf technology?

Quite a lot, it appears, and it's not just an irritating smart fridge. You can:

  • Control your lighting to create moods, optimise costs, or make it appear you are home. So, for example, you can make sure that it helps you get to sleep and wake up naturally.
  • Control your heating on a room by room basis to ensure that there are no cold spots in the house when it is occupied, but you don't heat empty rooms. Apparently this can save up to 15% in winter heating bills.
  • Automate your curtains to open when you want or on demand.
  • Co-ordinate home security with smart door bells (with built-in cameras), night vision cameras, internal and external cameras, smart sensors etc., all remotely controlled and accessible from your phone or pad, so even when you are not at home you know who is knocking at your door. Some sensors use machine learning to distinguish between normal patterns of, say, noise and abnormal ones to avoid false positives.
  • Adapt dumb devices to become smart, e.g. with the Wemo Maker.
  • Run your entertainment around the house.

But it does not stop there. There are devices for monitoring, interacting with and remotely feeding your pets. So you can check on them whilst you are at work or out for dinner at a restaurant.
There are even smart garden products which you can use to grow herbs and other plants indoors all year round. Using LED lighting, thermostats, pH meters etc., they can automatically water, light and grow the plants at three times normal rates, ensuring that you always have a small verdant box of plants in your home.

So what's the missing ingredient? To my mind it's the robot helper. A lot of companies have tried to crack this one with so-called intelligent hoovers and other home-buddy-type robots. There's also the trend towards trying to make sexbots. But so far, these have failed to live up to the expectation or hype, and come with price tags which are far in excess of their capability or benefits.

The foundations for the home of the future have been laid, but we are not quite there yet. That is not to say that we cannot get benefits from what's there: better energy efficiency, improved home security, access to all sorts of information and entertainment, automation which could help disabled and infirm people, and the ability to monitor our pets, children and elderly relatives whilst we are away from home. It's just that we still have some way to go.

Wednesday 2 May 2018

What's the Next Dominant User Device?

Figures produced by Statista show global sales of smartphones since 2007 following the pattern of a classic S-curve and plateauing last year (2017) at around 1.5 billion units per year.

If one considers that the average person will replace his or her device every 3 years or so (ignoring service providers trying to lock them into a replace-every-2-years cycle), then it suggests that the market is saturated.

Indeed, Canalys's figures suggest that smart phone shipments in China dropped last year, especially at the upper end of the market. This reinforces the recent news that Apple is not shipping as many of its new iPhone X models as anticipated.

So where is the attention of consumers going and does this signify a new battleground for consumer devices and internet access?

Tablets are certainly not taking over. Last year's figures indicated a 20% global drop in sales by the top 5 vendors, to around 200M units p.a., continuing a slide which appears to go back to 2014. In fact, Statista's figures for combined PC, laptop and tablet sales show significant declines and are not forecast to increase either, suggesting combined sales of around 400 million p.a., which is still dwarfed by smartphones.

Figures for wearable technologies (smart watches, smart glasses, rings etc.) are growing, but are only at around 100 million units p.a.

Smart speakers, such as Amazon's Echo (with Alexa) and Google Home (with Google Assistant), do not appear to be the answer either. Although fast growing and central to the home automation market, Canalys puts 2018 global sales projections at only around 56M, and to be honest smart digital assistants are already available on smartphone platforms.

Firesticks can be discounted too. They are really only focused on smart TVs and are too niche to shoulder the whole burden.

Augmented reality glasses and mixed reality smart glasses are still kick-starting after Google's original concept fired the imagination but failed in public. Google is back again with a refined, less clunky version, but so are a slew of other contenders: Microsoft with its HoloLens (holographic, gesture-driven Windows 10), Vuzix with the Blade and Alexa integration, and the Lightwear and Meta 2, just to name a few. However, global sales are yet to reach the 1 million units p.a. mark.

So where is the market going and who is going to capitalise on it? This is an interesting question, as the answer implies the next dominant driver of digital technology exploitation as well as which companies will profit.

Although IoT will end up connecting billions more devices than consumer devices, its influence is likely to be smaller as there is less money to be made there. 1nce, for example, is targeting a price point of 10 Euros per device for 10 years of connectivity support, as the economics don't support higher prices. Sparse data is the name of the game, not high-end functions and complex data sets.

Human-computer integration is coming, but is not here yet. Again this is interesting, as it appears to be catching up rapidly with wearables, so there may be some imminent convergence over the next 4 years or so.

So is fragmentation into niche applications slowing growth, or is this a pause before the next integrating concept comes along and what will it look like?

Group Technology For DevOps

One of the often overlooked concepts absorbed by the umbrella term of Lean is Group Technology. The concept originated in Russia, was refined in the UK and was then seized upon in North America from the 1960s to the 1980s, especially when robot cells and flexible manufacturing were the fashion in Manufacturing Systems.

Basically the concept was simple. Old-fashioned manufacturing factories used to be laid out in "functional" organisations. So, say, all the lathes were in one corner, the drilling machines would be in another part of the factory and the milling machines in a different part, or even a different building.

A component might require that: 

  1. bar metal was cut to size - done by the sawing section;
  2. it was turned to shape - done by the lathes section;
  3. some holes were drilled - done by the drilling section;
  4. a flat surface was milled on one side - done by the milling section;
  5. one end was polished extra smoothly - done by the grinding section;
  6. before final inspection, packing and shipping.

Each step would be conducted under the aegis of a different foreman. Each foreman had his own priorities and was tasked with running his section efficiently. At each step, the appropriate machine would have to be set up, perhaps with a special jig, as well as with its specific set of cutting tools.

So the process was a long series of queues and delays for set-up, with only a small portion of time spent actually cutting metal. Efficiency was assured by having as much work as possible queued at each section, so the machinists never ran out of work.

Process times were long and delivery dates were never guaranteed to meet promised dates, but it appeared efficient, even if lots of money (working capital) was tied up in work-in-progress inventory. It was great for the foremen, because they were never individually to blame if something was late.

Group technology was originally focused on reducing set-up times. It involved using a classification system to identify parts which were similar to manufacture, e.g. long thin round parts or large short squarish parts, and batching them together for manufacture at the same time.
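The classification-and-batching idea can be sketched in a few lines of Python; the part attributes below are purely illustrative:

```python
from itertools import groupby

# Classify parts by the features that drive machine set-up, then batch
# parts sharing a classification so each set-up is done only once.
parts = [
    {"id": "P1", "shape": "round", "size": "small"},
    {"id": "P2", "shape": "square", "size": "large"},
    {"id": "P3", "shape": "round", "size": "small"},
    {"id": "P4", "shape": "round", "size": "long-thin"},
]

def classify(part):
    """A simple classification key: parts with the same key share a set-up."""
    return (part["shape"], part["size"])

# groupby requires its input to be sorted by the grouping key.
batches = {}
for key, group in groupby(sorted(parts, key=classify), key=classify):
    batches[key] = [p["id"] for p in group]

print(batches)  # P1 and P3 share a classification, so share one set-up
```

Real group technology classification systems encoded many more features (material, tolerances, operations needed), but the principle is the same: one key, one batch, one set-up.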

Then the idea of the group technology cell was introduced. All the machines required to make, say, small round parts were organised into one part of the factory, in the normal sequence in which they might be used, creating a flow line. Similar parts were all sent to the same cell and put under the control of a single foreman, who was also responsible for delivery performance. This allowed parts to be batched and pipelined along the natural flow line to meet both effectiveness and efficiency needs. It also meant that there was only one queue, the one going into the cell, and work in progress could be reduced. Adding TQM techniques into the mix, so that the foreman was responsible for quality as well, turned him into a mini production manager and enabled quality to be improved. Then, where practicable, automation was introduced to raise overall performance further.

So what does this mean for IT operations? Well, most organisations have normally organised so that the Unix/Linux administrators sit in one corner reporting to a senior administrator. The NT administrators sit in another corner with their senior administrator. PC administration sits under desktop support. DBAs sit in their own little ghetto, with a senior DBA. The network administration people might be in another building. If global outsourcers are involved, then corners become "Competency Centres", often geographically dispersed.

In traditional "on premise" or "in data centre" operations, a large part of administrators' time is focused on the box-shifting aspects of the role, with manual administration of builds and maintenance tasks. No one team is responsible for the support of an application. So when things go wrong, different disciplines point the finger at each other, saying "our bit works, it must be theirs which has gone wrong". This makes reacting to incidents slow and inefficient, and slows down releases of new features. So what appears efficient is an inhibitor of value, slowing change and reducing uptime.

Modern "Everything as a Service" (XaaS) environments offer new opportunities. The job can be refocused from box-shifting and manual "oiling and adjusting" type tasks to a higher level of "end-to-end" application administration. This requires administrators to be re-organised into either application-facing (if the application is large, like an ERP or core banking system) or business-facing (if there are lots of linked smaller applications used by a business area) teams. They can then work with application- and business-facing development and test teams to meet the continuous delivery needs of modern digital products and applications. Tooling such as Splunk or Oracle's Management as a Service is good at pinpointing root causes of problems, ensuring that the team tracks down causes quickly and focuses on jointly fixing issues.

That way, accountability lies with the team and incidents can be resolved quickly, whilst new releases are jointly planned to optimise value to the business. This is Group Technology for DevOps.