There are some telling lessons for today’s larger organisations in the history of organisations themselves. Big data is not a new phenomenon – it is entirely relative and all too strikingly familiar. Over time, individuals and organisations have been constantly challenged by what seemed to be mountains of data containing a prospective valuable nugget of knowledge. In a recent McKinsey Quarterly article, ‘Big Data in the Age of the Telegraph’, Caitlin Rosenthal looks at the example of Daniel McCallum and the New York and Erie Railroad. Identifying the pain that ‘although the telegraph’s speed made more information available, organizing and acting on it became increasingly difficult,’ Rosenthal explores how McCallum sought to deal with both the deluge of information and the inherent need to have it available where it could be acted upon in the most timely fashion. He reversed the typical organisation chart and devolved the capability to take action to the place where it could be most effective. As Rosenthal asserts, “McCallum gained control by giving up control, delegating authority to managers who could use information in real time.”
This is a light read but a deep think, one that makes you consider how closely information systems must be modelled on the real-world circumstances they are trying to assist. It re-affirms that the much-ballyhooed ‘Big Data’ is too often a matter of collecting because we can: a multitude of sensors deluges us with data, and we have yet to create the information systems that turn it into operational knowledge and get it where it needs to be to have impact.