Design Thinking – Formalizing Innovation

Innovation has never been as important as it is now: in a market full of asymmetrical competitors and disruptors, non-innovative enterprises face extinction. Innovation has become too important to be treated as an organizational pastime that employees pursue in their free time once their daily tasks are done, so many attempts have been made to structure and formalize the innovation process, turning it from a serendipitous hit-or-miss act into a structured, reliable, repeatable business process. Design Thinking has been around since the 1960s, yet it has been gaining prominence lately with the trend toward formalizing innovation; organizations such as Procter & Gamble and Pepsi have adopted the methodology and used it to drive innovation internally.

The Design Thinking methodology has a broad implementation spectrum, with its design subject ranging from tangible items such as packaging and merchandise to much less tangible ones such as user experience and business processes.

In enterprises, innovation is usually conducted through experimentation: controlled divergent or convergent experiments, bounded by the limitations of the organization and measured against carefully assessed, predefined metrics. This analytical, sterile nature of experimentation allows for slow, progressive improvement of products, but breakthroughs by their nature do not fit this approach. Design Thinking, on the other hand, is a bridge between intuitive thinking and analytical thinking: it relies on emotions as much as it relies on numbers and completely overlooks the inherent limitations of current capabilities. What really sets Design Thinking apart, though, is that it is human-centric, starting with empathy and including rapid prototyping and experimentation to validate innovative ideas.

For every 1 percent of sales invested in design, profit rose 3 to 4 percent for five years. [1]

One way to implement Design Thinking is through the five-stage process proposed by Stanford's d.school. The process is iterative and includes the following stages: Empathise, Define, Ideate, Prototype and Test. These steps are non-linear and can be iterated over again and again until a refined, acceptable product is produced. For instance, you can jump back from the Test phase to the Define phase to understand why the produced prototype was not the correct solution for the problem; alternatively, you can jump forward from the Empathise phase directly to the Prototype phase, skipping both the Define and Ideate steps.


Empathise:

Design Thinking is a human-centric methodology and hence it starts with empathy. Empathy allows the designer to gain a deeper, more personal insight into the problem at hand. This phase includes coming up with personas and understanding their experience and their world view.

Define:

In this phase the problem gets defined as a problem statement in a human-centric manner. This requires some discipline and builds on the first stage: for instance, rather than defining a design problem as “reducing water wastage in residential accounts”, it should be defined as “residential water consumers should be equipped with the tools and information to reduce their water consumption”. This phase feeds into phase 3, Ideate.

Ideate:

In this phase a potential solution is proposed. There are several techniques to achieve that, such as Brainwriting, Brainstorming and Worst Possible Idea. The main idea here is not to be limited by the organization's limitations and capabilities.

Prototype:

A cheap, easy-to-build prototype is produced in this phase. The prototype should capture the essence of the idea and be possible to reproduce at scale. This phase also provides a reality check to the process, as limitations and capabilities are explored at this stage.

Test:

Once the prototype is produced it should be tested by a sample representing the target users. The prototype can be refined and retested until satisfying results are reached.

This five-stage approach is flexible and can be tailored to fit the problem at hand. The approach can also vary based on the setting and the organization, given how some organizations are more forgiving and failure-tolerant than others.

 

[1] Howkins, John (2003). The Creative Economy: How People Make Money from Ideas. The Penguin Press. pp. 121–122.

[2] Interaction Design Foundation, "5 Stages in the Design Thinking Process": https://www.interaction-design.org/literature/article/5-stages-in-the-design-thinking-process

[3] Simon, Herbert (1996). The Sciences of the Artificial (3rd Edition): https://monoskop.org/images/9/9c/Simon_Herbert_A_The_Sciences_of_the_Artificial_3rd_ed.pdf


Best Practices Development

In a psychological experiment, test subjects were placed in a room with a light bulb and a console full of keys, buttons and levers. The subjects were told that the light bulb would blink when a very specific set of actions was performed on the console, and they were promised a reward based on the number of times they got it to glow. The subjects would then try several approaches and combinations of buttons and levers until they got it to blink, and repeated that pattern until at some point they reached the conviction that the pattern had been discovered. Almost all of the subjects left the room believing they had cracked the pattern and knew exactly how to get that light bulb to blink.

The light bulb was actually set up to blink randomly, with no relation whatsoever to the console in front of the test subjects.

This research was part of a study on Operant Conditioning and its impact on superstition: how the human brain creates associations between potentially unrelated things based on the available inputs. Even when not nearly enough inputs are available to form a valid association, the human brain finds a way and associations are created. Examples range from how red cars are perceived to be faster than black ones to how secure certain passwords are believed to be.

In my opinion – and this is an opinion piece – Operant Conditioning is how enterprise best practices are formed. Over the years certain practices coincide with success, for reasons that may or may not have anything to do with the practices themselves, and gradually the enterprise gets conditioned to believe that these practices are guaranteed to generate positive results. This explains why some of the best practices I have encountered in my career make little or no sense at all; after all, best practices can alternatively be defined as superstitions.

Design or Develop

A while ago I was explaining the architecture of a solution I had developed when a colleague remarked that my solution was based on the integration of off-the-shelf solutions (OTS) rather than custom development, adding that a custom development would have been more impressive and efficient, not to mention superior in every way. Being a solution architect, that is a question I have to answer at the beginning of almost every project I am responsible for, and the answer is almost always to customize and integrate. In this post I am going to explain the logic behind such a decision.

  1. Someone has done it better: you might have access to the best developers, but unless you are developing something within the core technology of your company, you don't have the accumulated experience someone else has, especially if that solution is within their core technology area. When you limit yourself to in-house developed solutions, you also limit yourself to the skills you have within the confines of your company.
  2. Is it really custom development? Unless you are writing machine language or using your own compiler, you are actually customising; the only difference is the size of the building block. Developers use libraries and packages that someone with more experience has built, and most use them as black boxes without fully understanding their inner workings, and they really shouldn't have to.
  3. Cost: in theory you can build everything if you have no cost and time constraints; however, unless it's a toy or personal project, cost and time constraints are of key importance. The more customization a project requires, the more costly it becomes in terms of time and money. Licenses cost money, but they are almost always cheaper than the investment needed to build the same functions in house. There are also hidden costs associated with the involved risks and the bugs that will be discovered during QA and the initial customer experience.
  4. Accumulated experience: vendors building OTS solutions amass the combined experience of their customers, which means they have covered more use cases and edge cases than your in-house development team ever will, and this experience is merged into patches and product updates.
  5. Compliance: it's a lot easier to comply with certification criteria when using larger building blocks (OTS), as the certification bodies are familiar with the OTS solutions and the vendors are usually pre-certified. Trying to certify an in-house developed solution can be both expensive and time consuming.
  6. Integration best practices: this point is a bit tricky to explain. When an in-house solution is used, developers take integration shortcuts to save time that are often far from industry best practices; using OTS solutions enforces a certain level of conformity to best practices, which is valuable because at a later stage components can be swapped out with minimal impact.
  7. Operability & extensibility: even if you have the ultimate bespoke solution built specifically for you, finding people to operate it or build on top of it in the future might prove problematic, especially if the original team who built it has moved on.

So these are the reasons why you should almost always go for customization of OTS solutions rather than in-house development, using larger building blocks and microservices. How about the situations where you should in fact build your own solution, going for the smallest building block possible (the language's default libraries)? That's my next post.

The Indian Aadhaar Biometric Miracle

I belong to a generation that has forsaken the pen for a keyboard, and hence the skills required to maintain a static signature are long gone. Personally I have never trusted identity verification by matching a squiggly line on a piece of paper; it just doesn't feel accurate enough. Trying to visually match someone's appearance to a tiny black-and-white photo on a government-issued identification is even less accurate. Luckily, biometric verification is slowly replacing such old, inaccurate methods. Many countries have already adopted it for border control, yet very few countries use biometric methods past the point of entry. In comes the Indian Aadhaar project: in a few years the Indian government was able to accomplish nothing short of a miracle, collecting the biometric details (fingerprints and iris scans) of more than 1.1 billion citizens, a feat that reportedly required a $1 billion investment, and then gradually introducing biometric verification for many daily functions to replace the existing, less-than-accurate methods.

The relatively low price of the fingerprint scanner has allowed it to become a ubiquitous and cheap identity-verification tool. It is reported that more than 20 million checks are done every day, roughly 28,000 requests per minute assuming a 12-hour day, and the checks are all done in real time over public web services; one can only wonder about the SLA requirements for that system, or the architecture used to meet them. In India such a system is paying for itself, as the government was able to close the subsidy leakages that were being dispensed to ghost citizens who existed only on paper. This has been applauded in an Economist article stating that India has leapfrogged every country except Estonia. That is an inaccuracy, as Saudi Arabia has already implemented such a system successfully: in Saudi Arabia phone lines can only be sold after an online fingerprint verification done at the shop selling the line, through an online web service in real time.
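
For the curious, here is the back-of-envelope arithmetic behind that figure; the 20 million daily checks and the 12-hour operating day are the assumptions quoted above, not official numbers:

// rough throughput estimate for the verification volumes quoted above
const checksPerDay = 20000000;                     // reported daily verifications (assumption from the article)
const activeMinutes = 12 * 60;                     // assuming a 12-hour operating day
const perMinute = checksPerDay / activeMinutes;    // ≈ 27,778 requests per minute
const perSecond = perMinute / 60;                  // ≈ 463 requests per second
console.log(Math.round(perMinute), Math.round(perSecond));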

However, both the Indian and the Saudi programs are far from perfect. The fingerprint verification technology has certain challenges that both programs have inherited: not all people have clear fingerprints, and people who engage in manual labour often have fingerprints that cannot be verified. More importantly, online verification requires pervasive internet access to work, which isn't always available in remote areas. The first problem can be resolved as iris scanners become cheaper and easier to use, since iris imprints are much more accurate and stable than fingerprints. Internet access, however, is a challenge that can only be resolved by a huge infrastructure investment, which in my opinion is the largest hidden cost of such a program.

I wonder if we are going to see this digitization of identity verification implemented in Egypt any time soon.

MicroCule, A Love Story

Often I find myself in a situation that requires a quick-and-dirty custom micro-service to support one proof-of-concept or another. With the wide array of available NPM modules I usually resort to nodejs, and in a few minutes I'd have a nodejs script that does exactly what I want. Until just a few weeks ago, the only option I had for hosting that service was hook.io, a free micro-services hosting service that provides decent yet not stellar performance; as with any free cloud-based service, things often didn't work at the required performance level, and sometimes the nodejs module I wanted wasn't available on the server. However, short of starting my own app engine, installing all the containers and dealing with all the associated hassle, I had to make do with whatever was generously offered by hook.io.

In comes microcule. Microcule is a Software Development Kit and Command Line Interface for spawning streaming stateless HTTP microservices for any programming language or arbitrary binary.

Installed on my Amazon micro instance, it doesn't take more than one, yes ONE, command to spawn a fully qualified micro-service to do my bidding. And here is the fun part: it supports 20 different languages, so basically it's like self-hosting a very lean, easy-to-manage middleware that can turn any of your hacked scripts into web services. Microcule is part of the hook.io project, but it offers the core service without the whole hook.io array of bells and whistles, which I think is a very smart move by the hook.io people, given that most potential users just want to run their own web services rather than offer a web-services hosting service.
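
To illustrate, here is roughly what that workflow looks like; the file name is mine, and the exact service signature is my assumption from memory of the microcule README, so treat this as a sketch rather than a reference:

// hello.js – a hypothetical one-file service
module.exports = function service (opts) {
  // assumption: microcule hands the service an object carrying the HTTP response
  opts.res.end('hello world');
};

// after installing the CLI once (npm install -g microcule),
// a single command spawns it as an HTTP micro-service:
//   microcule hello.js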

I'm in love with microcule given how it has liberated me from Heroku, Google Apps, Amazon Lambda and the even more cumbersome self-hosted solutions. For all intents and purposes I think microcule is the perfect web-services hosting solution for prototyping, testing and development, perhaps even production with some careful configuration.

Node.js script to send email over SMTP

I had a challenge for which the only solution was building a nodejs webhook adapter for my SMTP server. Using webhooks and the following script I was able to enable API access for my email server (which it lacked).


// requires the 'smtp-connection' npm module (npm install smtp-connection)
const SMTPConnection = require('smtp-connection');

// SMTP server connection options
var options = {
  host: 'smtp.gmail.com',
  port: 465,
  secure: true, // implicit TLS on port 465
  // requireTLS: true,
  ignoreTLS: true,
  debug: true
};

// message envelope: sender and recipient addresses
var envelope = {
  from: '<user>@gmail.com',
  to: 'badr42@gmail.com'
};

// SMTP authentication credentials
var auth = {
  user: '<user>@gmail.com',
  pass: '<pass>'
};

// message body
var message = 'hello world';

let connection = new SMTPConnection(options);

// open the connection, then authenticate, then send the message
connection.connect(function (err, connect) {
  if (err) {
    console.log('error connecting');
    console.log(err);
  } else {
    console.log('attempting to login');
    console.log(connect);

    connection.login(auth, function (err, connect) {
      if (err) {
        console.log('error logging in');
        console.log(err);
      } else {
        console.log('logged in');
        console.log(connect);

        connection.send(envelope, message, function (err, result) {
          if (err) {
            console.log(err);
          } else {
            console.log('email sent');
            console.log(result);
          }
          // close the connection whether the send succeeded or not
          connection.quit();
        });
      }
    });
  }
});
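
To turn this into the webhook adapter mentioned at the top of this post, the same connect/login/send sequence can be wrapped in a plain HTTP handler. The sketch below uses only Node's core http module; the port, the JSON payload shape and the sendMail helper are my own assumptions for illustration, not part of the original script:

// webhook-mailer.js – hypothetical HTTP wrapper around the SMTP logic above
const http = require('http');
const SMTPConnection = require('smtp-connection');

// hypothetical helper wrapping the connect/login/send sequence from the script above
function sendMail(envelope, message, done) {
  const connection = new SMTPConnection({ host: 'smtp.gmail.com', port: 465, secure: true });
  connection.on('error', done); // surface connection-level failures
  connection.connect(function () {
    connection.login({ user: '<user>@gmail.com', pass: '<pass>' }, function (err) {
      if (err) { return done(err); }
      connection.send(envelope, message, function (err, result) {
        connection.quit();
        done(err, result);
      });
    });
  });
}

// assumed payload shape: { "to": "someone@example.com", "message": "hello" }
http.createServer(function (req, res) {
  let body = '';
  req.on('data', function (chunk) { body += chunk; });
  req.on('end', function () {
    const payload = JSON.parse(body);
    sendMail({ from: '<user>@gmail.com', to: payload.to }, payload.message, function (err) {
      res.statusCode = err ? 500 : 200;
      res.end(err ? 'failed' : 'sent');
    });
  });
}).listen(3000);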

What's a Business Architect

TOGAF divides enterprise architecture into four domains: Business, Data, Application and Technology. The most elusive to define is the business architecture domain; the business architect role is often reduced to that of a business analyst, thus missing the added value a business architect can bring to an organization. A business architect is the most important role when it comes to the future of the organization, as he or she works on the “what” part of the future while the technology domains work on the “how” part. A business architect also serves to bridge the gap between the different silos, making sure there is a holistic vision that incorporates all the needs of the organization and an IT strategy to match it.

Lately I came across what must be the best definition of the business architect's role, in “Be the Business: CIOs in the New Era of IT”. In this must-read for any aspiring CIO, the author defines a business architect as someone who architects the future vision of the business; someone who looks at where the overall enterprise is going. Business architects need to be able to think strategically but, equally important, they need to make that strategy actionable. She goes on to list what she looks for in business architects: “They can think conceptually, abstractly, and they speak the language of the business. But I’m also looking for people who have a systems architecture background, so that they understand how systems work together. It’s a tough skill-set, and because of that, we augment the team with some outside resources.”

I believe that as organizations become more technology-centric, and given the unique vantage point IT has across all silos and domains, the role of the business architect is going to gain prominence over the next few years: a cross-functional, blended executive who is able both to represent business within technology circles and to represent technology within business circles.