AWS Rekognition + OpenCV

Here is how you can use OpenCV with AWS Rekognition:

  1. Make sure your configured AWS account has access to Rekognition and S3
  2. Create a Rekognition collection; this will be the collection of faces the camera stream is matched against:
    aws rekognition create-collection --collection-id "faces" --region us-east-1
  3. List collections to make sure that the collection was created:
    aws rekognition list-collections --region us-east-1
  4. Create an S3 bucket and upload the photo with the face you’d like your camera to rekognize
  5. Index the faces in your S3 Bucket
    aws rekognition index-faces \
      --image '{"S3Object":{"Bucket":"BucketName","Name":"FileName.png"}}' \
      --collection-id "faces" \
      --region us-east-1
  6. The following Python code should work:
import io

import boto3
import cv2
from PIL import Image

BUCKET = "BUCKETNAME"          # the bucket where the face you're trying to match lives
COLLECTION = "COLLECTIONNAME"  # the collection of faces you have already indexed

frame_skip = 12  # analyze every 12th frame to cut down on Rekognition API calls

def search_faces_by_image(bin_img, collection_id, threshold=80, region="us-east-1"):
    rekognition = boto3.client("rekognition", region)
    response = rekognition.search_faces_by_image(
        Image={"Bytes": bin_img},
        CollectionId=collection_id,
        FaceMatchThreshold=threshold,
    )
    if response["FaceMatches"]:
        return response["FaceMatches"]
    print("no faces found")
    return []

vidcap = cv2.VideoCapture(0)
cur_frame = 0
success = True
while success:
    success, frame = vidcap.read()  # get next frame from video
    if success and cur_frame % frame_skip == 0:  # only analyze every n frames
        print("frame: {}".format(cur_frame))
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV frames are BGR; PIL expects RGB
        pil_img = Image.fromarray(rgb)  # convert the OpenCV frame (a numpy array) into a PIL Image
        stream = io.BytesIO()
        pil_img.save(stream, format="JPEG")  # convert the PIL Image to JPEG bytes
        bin_img = stream.getvalue()
        faces = search_faces_by_image(bin_img, COLLECTION)
    cur_frame += 1
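SearchFacesByImage returns a FaceMatches list, where each match carries a Similarity score and the matched Face's metadata (FaceId, ExternalImageId, and so on). Here is a minimal sketch of pulling the strongest match out of that structure; the sample response below is illustrative, with made-up IDs, not real API output:

```python
# Illustrative sample of the shape Rekognition returns from search_faces_by_image;
# the FaceId/ExternalImageId values below are made up.
sample_response = {
    "FaceMatches": [
        {"Similarity": 97.3, "Face": {"FaceId": "11111111-2222-3333-4444-555555555555",
                                      "ExternalImageId": "FileName.png"}},
        {"Similarity": 85.1, "Face": {"FaceId": "66666666-7777-8888-9999-000000000000",
                                      "ExternalImageId": "other.png"}},
    ]
}

def best_match(face_matches):
    """Return (external_image_id, similarity) for the strongest match, or None."""
    if not face_matches:
        return None
    top = max(face_matches, key=lambda m: m["Similarity"])
    return top["Face"].get("ExternalImageId"), top["Similarity"]

print(best_match(sample_response["FaceMatches"]))  # -> ('FileName.png', 97.3)
```

In the camera loop above you would feed the list returned by search_faces_by_image into a helper like this, and, for example, only unlock a door when the similarity clears your threshold.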

Design Thinking – Formalizing Innovation

Innovation has never been as important as it is now: in a market full of asymmetrical competitors and disrupters, non-innovative enterprises face extinction. Innovation has become too important to be treated as an organizational pastime that employees pursue in their free time, after they are done with their daily tasks, and so many attempts have been made to structure and formalize the innovation process, turning it from a serendipitous hit-or-miss act into a more structured, reliable, repeatable business process. Design Thinking has been around since the 60s, yet it has been gaining prominence lately with the formalization-of-innovation trend; organizations such as Procter & Gamble and Pepsi have adopted this methodology and used it to drive innovation internally.

The Design Thinking methodology has a broad implementation spectrum, with its design subject ranging from packaging and merchandise to much less tangible things such as user experience and business processes.

In enterprises, innovation is usually conducted through experimentation: controlled divergent or convergent experiments bounded by the limitations of the organization, with carefully assessed, predefined metrics. This analytical, sterile nature of experimentation allows for slow, progressive improvement of products, but breakthroughs do not fit this approach. Design Thinking, on the other hand, is a bridge between intuitive thinking and analytical thinking; it relies on emotions as much as it relies on numbers and completely overlooks the inherited limitations of current capabilities. What really sets Design Thinking apart, though, is how human-centric it is, starting with empathy and including rapid prototyping and experimentation to validate innovative ideas.

For every 1 percent of sales invested in design, profit rose 3 to 4 percent for five years. [1]

One of the ways to implement Design Thinking is the 5-stage process proposed by Stanford's d.school. The process is iterative and includes the following stages: Empathise, Define, Ideate, Prototype and Test. These steps are non-linear and can be iterated over again and again until a refined, acceptable product is produced. For instance, you can jump back from the Test stage to the Define stage to understand why the produced prototype wasn't the correct solution for the problem; alternatively, you can jump forward from the Empathise stage directly to the Prototype stage, skipping both the Define and Ideate steps.



Empathise

Design Thinking is a human-centric methodology, and hence it starts with empathy. This allows the designer to get a deeper, more personal insight into the problem at hand. This stage includes coming up with personas and understanding their experience and their world view.


Define

In this stage the problem gets defined as a problem statement in a human-centric manner. This requires some discipline and builds on the first stage: for instance, rather than defining a design problem as "reducing water wastage in residential accounts", it should be defined as "residential water consumers should be equipped with the tools and information to reduce their water consumption". This stage feeds into stage 3, Ideate.


Ideate

In this stage potential solutions are proposed; there are several techniques to achieve that, such as Brainwriting, Brainstorming and Worst Possible Idea. The main idea here is not to be limited by the organization's limitations and capabilities.


Prototype

A cheap, easy-to-build prototype is produced in this stage. The prototype should capture the essence of the idea and be possible to reproduce at scale. This stage also provides a reality check to the process, as limitations and capabilities are explored here.


Test

Once the prototype is produced it should be tested by a sample representing the target users. The prototype can be refined and retested until satisfying results are reached.

This 5-stage approach is flexible and can be tailored to fit the problem at hand; it can also vary based on the setting and the organization, given how some organizations are more forgiving and failure-tolerant than others.


[1] Howkins, John (2003). The Creative Economy: How People Make Money from Ideas. The Penguin Press. pp. 121–122.


[3] Herbert Simon, The Sciences of the Artificial (3rd Edition), 1996.

Best Practices Development

In a psychological experiment, the test subjects were placed in a room with a light bulb and a console full of keys, buttons and levers. The test subjects were told that the light bulb would blink when a very specific set of actions was performed on that console, and were promised a reward for the number of times they got it to glow. The test subjects would then try several approaches and combinations of buttons and levers until they got it to blink, and repeated that pattern until at some point they reached the conviction that the pattern had been discovered. Almost all of the test subjects left the room believing they had cracked the pattern and knew exactly how to get that light bulb to blink.

The light bulb was actually set up to blink randomly, with no relation whatsoever to the console in front of the test subjects.

This research was part of a study on Operant Conditioning and its impact on superstition: how the human brain creates associations between potentially unrelated things based on the available inputs. Even when not nearly enough inputs are available to form a valid association, the human brain finds a way, and associations are created. Examples range from how red cars are perceived to be faster than black ones to how secure certain passwords are believed to be.

In my opinion, and this is an opinion piece, Operant Conditioning is how enterprise best practices are formed. Over the years certain practices coincide with success, for reasons that might or might not have anything to do with the practices themselves, and gradually the enterprise gets conditioned to believe that these practices are guaranteed to generate positive results. This explains how some of the best practices I've encountered in my career make little or no sense at all; after all, best practices can be alternatively defined as superstitions.

Design or Develop

A while ago I was explaining the architecture of a solution I had developed when my listener remarked that my solution was based on the integration of off-the-shelf (OTS) solutions rather than custom development, adding that a custom development would have been more impressive and efficient, not to mention superior in every way. Being a solution architect, that's a question I have to answer at the beginning of almost every project I am responsible for, and the answer is almost always customize and integrate. In this post I'm going to explain the logic behind such a decision.

  1. Someone has done it better: you might have access to the best developers, but unless you are developing something within the core technology of your company, you don't have the accumulated experience someone else has, especially if that solution is within their core technology area. When you limit yourself to in-house developed solutions you also limit yourself to the skills within the confines of your company.
  2. Is it really custom development: unless you are writing machine language or using your own compiler, you are actually customising; the only difference is the building-block size. Developers use libraries and packages that someone with more experience has built, and most use them as black boxes without fully understanding their inner workings, and they really shouldn't have to.
  3. Cost: in theory you can build everything if you have no cost and time constraints; however, unless it's a toy or personal project, cost and time constraints are of key importance. The more customization a project requires, the more costly it becomes in terms of time and money. Licenses cost money, but they are almost always cheaper than the investment needed to build the same functions in house. There is also a hidden cost associated with the involved risks and the bugs that will be discovered during QA and the initial customer experience.
  4. Accumulated experience: vendors building OTS solutions amass the combined experience of their customers, which means they've covered more use cases and edge cases than your in-house development team ever will; this experience is merged into the patches and product updates.
  5. Compliance: it's a lot easier to comply with certification criteria when using larger building blocks (OTS), as the certification bodies are familiar with the OTS solutions and the vendors are usually pre-certified. Trying to certify an in-house developed solution can be both expensive and time consuming.
  6. Integration best practices: this point is a bit tricky to explain. When an in-house solution is used, the developers take integration shortcuts to save time that are often far from industry best practices; using OTS solutions enforces a certain level of best-practice conformity. This is valuable, as at a later stage components can be swapped out with minimal impact.
  7. Operability & Extensibility: even if you have the ultimate bespoke solution built specifically for you, finding people to operate it or build on top of it in the future might prove problematic, especially if the original team who built it has moved on.

So these are the reasons why you should almost always go for customization of OTS solutions rather than in-house development, using larger building blocks with microservices. How about the situations where you should in fact build your own solution, going for the smallest building block possible (the language's default libraries)? That's my next post.

The Indian Aadhaar Biometric Miracle

I belong to a generation that has forsaken the pen for a keyboard, and hence the skills required to maintain a static signature are long gone. Personally, I've never trusted identity verification by matching a squiggly line on a piece of paper; it just doesn't feel accurate enough. Trying to visually match someone's appearance to a tiny black-and-white photo on a government-issued identification is even less accurate. Luckily, biometric verification is slowly replacing such old, inaccurate methods. Many countries have already adopted biometrics for border control; however, very few countries use biometric methods past the point of entry. In comes the Indian Aadhaar project. In a few years the Indian government was able to accomplish nothing short of a miracle: collecting the biometric details (fingerprints and iris scans) of more than 1.1 billion citizens, a feat that reportedly required a $1 billion investment, and then gradually introducing biometric verification for many daily functions to replace the existing, less-than-accurate methods.

The relatively low price of the fingerprint scanner has allowed it to become a ubiquitous, cheap identity-verification tool. It is reported that more than 20 million checks are done every day; spread over a 12-hour day, that is roughly 28,000 requests per minute, all done in real time over public web services. One can only wonder about the SLA requirements for that system, or the architecture used to meet them. In India such a system pays for itself, as the government was able to close the subsidy leakages that were being dispensed to ghost citizens who existed only on paper. This was applauded in an Economist article stating that India has leapfrogged every country except Estonia. That is inaccurate, as Saudi Arabia had already implemented such a system successfully: in Saudi Arabia, phone lines can only be sold after an online fingerprint verification done at the shop selling the line, through an online web service in real time.
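The request rate above is a back-of-the-envelope figure that can be checked directly; the 20 million daily checks and the 12-hour active day are the post's assumptions, not official figures:

```python
checks_per_day = 20_000_000  # reported daily verification volume (assumption from the post)
active_minutes = 12 * 60     # assume the checks happen over a 12-hour day

checks_per_minute = checks_per_day / active_minutes
print(round(checks_per_minute))  # -> 27778, i.e. roughly 28,000 requests per minute
```

At that sustained rate, even small per-request latency variations matter, which is what makes the SLA question interesting.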

However, both the Indian and the Saudi programs are far from perfect. Fingerprint-verification technology has certain challenges that both programs have inherited: not all people have clear fingerprints, and people who engage in manual labour often have fingerprints that can't be verified. More importantly, online verification requires prevalent internet access, which isn't always available in remote areas. The first problem can be resolved as iris scanners become cheaper and easier to use, since iris imprints are much more accurate and stable than fingerprints. Internet access, however, is a challenge that can only be resolved by a huge infrastructure investment, which in my opinion is the largest hidden cost of such a program.

I wonder if we are going to see this, the digitization of identity verification, implemented in Egypt any time soon.

MicroCule, A Love Story

Often I find myself in a situation that requires a quick and dirty custom micro-service to support one proof of concept or another. With the wide array of modules available on NPM I usually resort to Node.js, and in a few minutes I'd have a Node.js script that does exactly what I want. Until just a few weeks ago, the only option I had for hosting that service would have been hook.io, a free micro-services hosting service that provides decent yet non-stellar performance. As with any cloud-based free service, things often didn't work at the required performance level, and sometimes the Node.js module I wanted wasn't available on the server. However, short of starting my own app engine, installing all the containers and taking on all the associated hassle, I had to make do with whatever was generously offered for free.

In comes Microcule. Microcule is a Software Development Kit and Command Line Interface for spawning streaming, stateless HTTP microservices for any programming language or arbitrary binary.

Installed on my Amazon micro instance, it doesn't take more than one, yes ONE, command to spawn a fully qualified micro-service to do my bidding. And here is the fun part: it supports 20 different languages, so it's basically like self-hosting a very lean, easy-to-manage middleware that can turn any of your hacked scripts into web services. Microcule is part of the hook.io project, but it offers the core service without the whole array of bells and whistles, which I think is a very smart move from the hook.io people, given that most potential users would just want to run their own web services rather than offer a web-services hosting service.

I'm in love with Microcule, given how it has liberated me from Heroku, Google Apps, Amazon Lambda and the even more cumbersome self-hosted solutions. For all intents and purposes, I think Microcule is the perfect web-services hosting solution for prototyping, testing and development, perhaps even production with some careful configuration.