Cave Rescue: A lesson in leadership

July 10, 2018

The coming weeks will likely reveal the intricacies of the rescue of the youth soccer team from the Chiang Rai cave in Thailand. But from the outside, it seems to me that the operation is a clear example of successful leadership in a complex environment.

  • The area’s acting Governor, Narongsak Osatanakorn, personally took charge of the operation, quietly and effectively,
  • He quickly assembled a team of specialists from many different areas: speleology, health, meteorology, parent relations, etc.
  • The team invited outside experts and volunteers into the process while clearly remaining in control,
  • The team evaluated alternative strategies for the rescue, taking input from every discipline,
  • Meanwhile, and without skipping a beat, a gigantic logistics effort got underway, procuring and deploying all types of equipment (pumps, ropes, oxygen, ambulances, food) as well as the personnel and goods needed to support the primary teams,
  • The top team established policies that were strongly implemented:
    • Privacy comes to mind. Photos of the rescued kids, inside or outside the cave, may have been taken, but they will likely not be released until all are successfully out. The press was not allowed near operational areas.
    • Parent relations seem to have been handled very well. All parents are on board with the team’s approaches and policies.
    • There is a news blackout of sorts, to the benefit of rescuers and victims. Politicians are not exploiting the tragedy by parading in front of cameras.
  • Finally, the team has shown the ability to take advantage of changing circumstances, adapting the rescue plan when it paid to do so. For instance, the second rescue operation was launched hours earlier than originally planned because oxygen was resupplied faster than expected.
  • This is a clear sign of good leadership:
    • A tactical plan is in place to support the mission.
    • The core team is in constant communication, discussing and re-evaluating execution of that plan, and
    • The leader is constantly updated and is willing to take decisive action.

      Bravo! A lesson to be learned!


Complexity, Reliability And Cost

Reprinted from Semiconductor Engineering 

June 14th, 2018 – By Ed Sperling

Fraunhofer EAS’s top scientist digs into new technical and business challenges shaping the semiconductor industry.


The Guardian view on internet security: complexity is vulnerable

Reprinted from The Guardian 

A huge weakness in wifi security erodes online privacy. But the real challenge is designing with human shortcomings in mind

This week’s security scandal is the discovery that every household with wifi in this country has a network that isn’t really private. For 13 years a weakness has lurked in the supposedly secure way in which wireless networks carry our information. Although the WPA2 security scheme was supposed to be mathematically proven to be uncrackable, it turns out that the mechanism by which it can compensate for weak signals can be compromised, and when that happens it might as well be unencrypted. Practically every router, every laptop and every mobile phone in the world is now potentially exposed. As the Belgian researcher who discovered the vulnerability points out, this could be abused to steal information such as credit card numbers, emails and photos.

It is not a catastrophic flaw: the attacker has to be within range of the wifi they are attacking. Most email and chat guarded by end-to-end encryption is still protected from eavesdroppers. But the flaw affects a huge number of devices, many of which will never be updated to address it. Since both ends of a wifi connection need to be brought up to date to be fixed, it is no longer safe to assume that any wifi connection is entirely private.

The story is a reminder of just how much we all now rely on the hidden machineries of software engineering in our everyday lives, and just how complex those machineries are. The fact that it took 13 years for this weakness to be found and publicised shows that no one entirely understands the systems that we all now take for granted. Also this week, a flaw was discovered in one of the widely used chips that are supposed to produce the gigantic and completely random numbers which are needed to make strong encryption truly unbreakable. Even the anti-virus systems that many users hope will protect them can be turned inside out. First the Israeli and then the Russian intelligence agencies appear to have penetrated the Russian-made Kaspersky Anti-Virus, a program of the sort which must have access to all the most sensitive information on a computer to perform its functions.
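The point about random numbers can be made concrete. Here is a minimal sketch in Python (an illustration of the general principle, not of the chip flaw mentioned above) of the difference between a general-purpose pseudo-random generator and a cryptographically secure one:

```python
import random   # general-purpose PRNG: reproducible from its seed
import secrets  # CSPRNG drawing on the operating system's entropy pool

# The Mersenne Twister behind `random` is fine for simulations, but anyone
# who learns (or guesses) its internal state can predict every later value.
weak_token = random.getrandbits(128)

# `secrets` is built for security: outputs stay unpredictable even to an
# observer who has seen arbitrarily many previous outputs.
strong_token = secrets.token_hex(16)  # 16 random bytes as 32 hex digits

print(f"simulation-grade: {weak_token:032x}")
print(f"crypto-grade:     {strong_token}")
```

If the “completely random” numbers feeding a cryptosystem turn out to be predictable, every key derived from them inherits the weakness.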

And then there are the known unknowns: the devices which most users do not even notice are connected to the net. It is estimated that there will be 21bn things connected to the internet by 2020, from baby monitors and door locks to cars and fridges. Billions of these are unprotected and will remain that way.

But this kind of technological failure should not blind us to the real dangers of the digital world, which are social and political. The information about ourselves that we freely give away on social media, or on dating sites, is far more comprehensive, and far more potentially damaging, than anything which could be picked up by a lurking wifi hacker. The leak of millions of user accounts from Equifax, the credit reference agency, is only the most recent example of the plundering of personal information by criminals.

Such hacks might be regarded as the outcome of technical wizardry, but are dependent on human shortcomings in recognising and fixing security flaws. Others would be impossible without tricking real users out of their passwords first. In criminal hands, social engineering beats software engineering every time, and the problems of the internet cannot entirely be solved by technical means. Until we design for human nature, no perfection of machinery can save us.



How Network Complexity Killed Water Cooler Collaboration

by Grant Ho | NetBrain Technologies

Reprinted from The VARGuy.com

Effective collaboration could once be defined as hanging around the company water cooler discussing the latest network issues and emerging trends. Thanks to the ever-increasing complexity and scale of today’s networks, however, these casual conversations can no longer be classified as an effective method of information sharing. As network size and complexity increase, so do network teams. Enterprise networks are no longer operated by small teams in a single location, but by teams of varying technical skills spread across diverse geographies. If network teams want to be on the same page as the rest of their IT counterparts, with the ability to respond quickly to a network issue, new forms of collaboration and information sharing are required.

Out of sight, out of mind?

The new era of collaboration requires an effective strategy that ensures vital information is shared across teams. However, in a recent NetBrain survey, 72 percent of network engineers cited a lack of collaboration between teams, specifically network and security teams, as the number one challenge when mitigating an attack. Due to increasing network complexity, these teams have become more siloed, making ongoing communication difficult. This becomes problematic when a network outage arises and teams don’t know how to jointly respond as they have little to no experience working together. The result? Hours wasted on communicating issues that should be standard procedure rather than swiftly addressing and repairing the problem.

Many network teams are combatting this issue with a multi-phase approach to improve collaboration, process and tools. When it comes to the network, automation is a critical enabler for all stakeholders by providing the ability to share domain expertise and operational data during network problems.

Democratize knowledge

The simplest form of collaboration is knowledge-sharing. This means making sure that everyone tasked with managing the network is equipped with the appropriate information to perform their job optimally. While it seems simple, the approach can be a significant challenge for any enterprise network team.

Today, teams struggle to document and share knowledge because the process is time consuming and tedious. This limits the ability to scale, as critical network information is often stored in the brains or on the hard drives of tribal leaders who have worked on a specific network for many years; that domain knowledge runs deep. While tribal leaders have spent years honing their skills and learning the ins and outs of their networks, organizations gain an advantage by ensuring more network engineers are equipped with similar levels of information. For instance, what happens when a busy, senior Level-3 engineer isn’t around to troubleshoot a network outage? Democratizing her best practices so that more junior engineers (i.e., Level-1 and Level-2 engineers) can diagnose the problem, instead of waiting and escalating all the way to the Level-3 engineer, can result in quicker response times and better SLAs.

Streamline data sharing

While sharing best practices is critical, collaboration is more than just a clear picture of how to do the work. Sharing is also crucial at the task level, where insights and conclusions should be reached as a team. However, organizations often struggle with this process: many network teams communicate via email or web conference, where data sharing becomes cumbersome, arriving as log files or data dumps.

Drawing key insights and actionable decisions from a data dump is difficult. Even if a dump contains exactly what an individual needs for the task at hand, it can be time consuming and tedious to work through. These manual methods of data collection and sharing (e.g., box-by-box, screen scraping or legacy home-grown scripts) result in slower troubleshooting and a longer mean time to repair (MTTR). Take the example of a typical network operations center. Here a high degree of redundant work can happen, as Level-3 engineers often have to repeat the same tasks as Level-2 engineers, and Level-2 engineers have to do the same with Level-1 engineers. The culprit is largely a poor flow of information: incomplete documentation at best, incorrect documentation at worst. Providing network teams with a common visual interface instead (for instance, a map of the network’s problem area) lets them access the most relevant data while using shared insights to accelerate decision-making.

Security through collaboration and automation

While collaboration is critical to network troubleshooting, it becomes particularly essential when the network comes under attack. During a security incident, the network team typically works with the security team, the applications team, and related managers. With so many stakeholders involved, centralized information becomes imperative. That’s why it’s critical to democratize best practices and seamlessly share information to drive shorter repair times and better proactive security.

Again, automation plays a key role. For instance, by automating the creation of the exact attack path, network and security teams can quickly get on the same page by gaining instant visibility into the problem. Moreover, when diagnosing the problem, automating the best practices contained in off-the-shelf playbooks, guides, and security checklists is essential. Digitizing those steps into runbooks that can be automatically executed, and capturing runbook insights so they can be shared across network and security teams, results in faster responses and less human error. These runbooks can then be enhanced with lessons learned from each security event to improve responses down the road. As networks are increasingly at risk, organizations that learn from the past will be at an advantage when it comes to mitigating future threats.
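To make the runbook idea concrete, here is a minimal sketch in Python. This is not NetBrain’s product API; the step functions and device name are hypothetical stand-ins for real diagnostics. The point is that steps become code that executes the same way every time and whose findings are captured in a form teams can share:

```python
from datetime import datetime, timezone

# Hypothetical diagnostic steps; real ones would query actual devices.
def check_interface_status(device):
    return f"{device}: all interfaces up"

def check_acl_changes(device):
    return f"{device}: no ACL changes in the last 24h"

# A runbook is simply an ordered list of named steps.
RUNBOOK = [
    ("Verify interface status", check_interface_status),
    ("Audit recent ACL changes", check_acl_changes),
]

def execute_runbook(device):
    """Run every step, capturing timestamped findings to share across teams."""
    return [
        {
            "step": name,
            "finding": step(device),
            "at": datetime.now(timezone.utc).isoformat(),
        }
        for name, step in RUNBOOK
    ]

for record in execute_runbook("core-router-1"):
    print(record["step"], "->", record["finding"])
```

Because the captured findings are structured rather than raw log dumps, they can feed directly into the shared visibility described above.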

The bottom line is that the scale and complexity of networks are changing how organizations respond to network issues and security threats. Automating critical data-sharing will foster better collaboration and results than the water cooler ever did.

About the Author

Grant Ho is an SVP at NetBrain Technologies, provider of the industry’s leading network automation platform. At NetBrain, he helps lead the company’s strategy and execution, with a focus on products, events, content and more. Prior to joining NetBrain, Grant held various leadership roles in the healthcare IT industry and began his career as a strategy consultant to wireless and enterprise software companies. You can follow Grant on Twitter @grantho and NetBrain @NetBrainTechies.


Complexity and AR (Augmented Reality)

I have commented on this blog on several occasions about the fact that we are living in an increasingly complex world. Fields of knowledge, from medicine to the social sciences and beyond, are ever-expanding. According to a 2013 IBM article, 2.5 quintillion bytes of data are created every day. This data comes from sensors, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few sources. Connections and linkages between data points, the true source of complexity, are also expanding. Google can link people to places through their phones, and can tell how long they stayed at each place. Amazon, Facebook, and Google understand people and their interests based on the data they collect through interactions.
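One way to see why the linkages, rather than the data points themselves, are the real source of complexity: data grows linearly, but the number of potential pairwise connections grows quadratically, as n(n-1)/2 for n points. A quick illustration in Python:

```python
# Potential pairwise links among n data points: n * (n - 1) / 2.
# The data itself grows linearly; the space of possible connections
# between data points grows quadratically.
for n in (10, 1_000, 1_000_000):
    links = n * (n - 1) // 2
    print(f"{n:>9,} points -> {links:>15,} potential links")
```

Ten points allow 45 links; a million points allow roughly half a trillion.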

The proliferation of data, and of the ways data relate to one another, makes it more and more difficult for us to find and understand what we need to know, and to make sound decisions. We need tools that help us take advantage of this data and that inform and educate us. This is where AR (augmented reality) comes in.

AR is a relatively new concept that seeks to overlay digital components on top of a real scene. This can be done through viewing glasses or a screen on which objects or information are presented over a live or still view. Large companies like Facebook, Google, Apple and Microsoft are each embracing this general idea with different objectives and perspectives.

I came across postings by Luke Wroblewski (LinkedIn), a product director at Google. Luke has begun to describe a conceptual approach to AR in which digital overlays are designed to serve specific functions by leveraging contextual data (in this instance, data known to Google). In one mock-up, the AR algorithm recognizes that the driver needs to find a gas station; the AR platform then overlays the price differential and distance of alternative stations relative to the one in sight. To me, this is a great example of how AR can help manage complexity. The platform distills the inherent relationship between cost and distance, which is at the heart of the “complex” decision the driver must make: What are the risks of driving further to save money? Do I have time? Do I have enough gas in the tank? Do I really know how little gas is in the tank?
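A back-of-the-envelope sketch of the trade-off such an overlay would be distilling (all figures hypothetical):

```python
# Is the cheaper station far enough away that the detour eats the savings?
tank_fill_liters = 40.0
price_here = 1.45        # per-liter price at the station in sight
price_there = 1.38       # per-liter price at the alternative station
extra_km = 6.0           # additional round-trip distance to get there
consumption = 0.08       # liters of fuel burned per km

savings = (price_here - price_there) * tank_fill_liters
detour_cost = extra_km * consumption * price_there

print(f"Savings at the pump: {savings:.2f}")
print(f"Cost of the detour:  {detour_cost:.2f}")
print("Worth the drive" if savings > detour_cost else "Fill up here")
```

The overlay’s value is that it performs exactly this kind of distillation, using data the platform already holds, so the driver does not have to.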

Wroblewski and Google are onto something here. What is needed, to quote Meyer Z. Pesenson et al. in their paper “The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch,” is “representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multi-resolution data across application domains.”

AR may be more than a hammer in search of a nail. It may be a new conceptual approach to help us deal with big data and its complexity.


Insight into Brain’s Complexity Revealed Thanks to New Applications of Mathematics

Re-posted from the European Union CORDIS

The lack of a formal link between neural network structure and its emergent function has hampered our understanding of how the brain processes information. The discovery of a mathematical framework that describes the emergent behaviour of a network in terms of its underlying structure brings that understanding one step closer.


The need to understand geometric structures is ubiquitous in science and has become an essential part of scientific computing and data analysis. Algebraic topology offers the unique advantage of providing methods to describe, quantitatively, both local network properties and the global network properties that emerge from local structure.

As the researchers working on the Blue Brain project explain in a paper, ‘Cliques of Neurons Bound into Cavities Provide a Missing Link between Structure and Function’, while graph theory has been used to analyse network topology with some success, current methods are usually constrained to establishing how local connectivity influences local activity or global network dynamics.

Their work reveals structures in the brain with up to eleven dimensions, exploring the brain’s deepest architectural secrets. ‘We found a world that we had never imagined,’ says neuroscientist Henry Markram, director of the Blue Brain Project. ‘There are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to eleven dimensions.’

As the complexity increases, algebraic topology comes into play: it is a branch of mathematics that can describe systems with any number of dimensions. Researchers describe algebraic topology as being like a microscope and a telescope at the same time, zooming into networks to find hidden structures and seeing the empty spaces. As a result, they found what they describe in their paper as a remarkably high number and variety of high-dimensional directed cliques and cavities. These had not been seen before in neural networks, either biological or artificial, and were identified in far greater numbers than those found in various null models of directed networks.
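For readers who want a feel for the basic object, here is a minimal sketch using Python and the networkx library (assumed installed) that enumerates cliques, the all-to-all connected subsets, in a small undirected toy graph. The paper works with directed cliques and cavities, which are more involved; this simplification only illustrates the idea:

```python
import networkx as nx  # assumes the networkx package is available

# A toy network of five "neurons"; edges mark symmetric connections.
G = nx.Graph()
G.add_edges_from([
    (0, 1), (0, 2), (1, 2),           # nodes 0, 1, 2 form a triangle
    (1, 3), (2, 3), (2, 4), (3, 4),
])

# enumerate_all_cliques yields every fully connected subset, small to large.
for clique in nx.enumerate_all_cliques(G):
    if len(clique) >= 3:  # report triangles and larger
        print(f"{len(clique)}-clique: {clique}")
```

In the Blue Brain analysis, the directed analogues of these objects, together with the cavities between them, are what the researchers counted in far greater numbers than chance would predict.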

The study also offers new insight into how correlated activity emerges in the network and how the network responds to stimuli. Partial support was provided by the GUDHI (Algorithmic Foundations of Geometry Understanding in Higher Dimensions) project, supported by an Advanced Investigator Grant from the EU.

For more information, please see:
CORDIS Project website

Source: Based on project information and media reports

Howard Raiffa, Harvard professor and decision analysis pioneer, dies at 92

From the Harvard Gazette:

Howard Raiffa, the Frank P. Ramsey Professor Emeritus of Managerial Economics, died July 8 at his home in Arizona following a long battle with Parkinson’s disease.  Raiffa joined the Harvard faculty in 1957. With a diverse group of Harvard stars that included Richard Neustadt, Tom Schelling, Fred Mosteller, and Francis Bator, Raiffa would form the core of what would be the modern Kennedy School (HKS) in 1969, and played a central role in the School for decades as a teacher, scholar, and mentor. Together with colleague Robert Schlaifer, Raiffa wrote the definitive book developing decision analysis, “Applied Statistical Decision Theory,” in 1961. He also wrote a textbook for students like those at HKS, and a simpler, popular book on the subject.

“Along with a handful of other brilliant and dedicated people, Howard figured out what a school of public policy and administration should be in the latter decades of the 20th century, and then he and they created that school,” said Raiffa’s longtime friend and colleague Richard Zeckhauser, Frank Plumpton Ramsey Professor of Political Economy.

“Despite his great accomplishments as a teacher and scholar, those who knew Howard well treasured him for the generosity of his spirit, his great warmth, and his desire to always be helpful, whether fostering cooperation among nations, choosing where to locate Mexico City’s airport, or designing a curriculum for teaching analytic methods.”

This combination of work marks Raiffa as a model for the Kennedy School: His scholarly analysis advanced experts’ understanding of many important questions, and he also knew how important and valuable it was for him to speak to the broader world.  In particular, he recognized that the methods he had pioneered and mastered could be helpful to people with much less sophistication, and he reached out to help them.

“Howard was a giant in the history of the Kennedy School and a towering figure in the fields of decision analysis, negotiation analysis, and game theory,” said HKS Dean Douglas Elmendorf. “All of us who are associated with the Kennedy School are greatly in his debt.”
