By Phil Scanlon, Vice President of Sales Engineering, Asia Pacific, Japan & Middle East at Solace
A gust of strong wind was all it took to run a 400-meter container ship aground and put one of the world's busiest trade routes on pause. Serving as a vital link between Europe and Asia, the Suez Canal carries roughly 12% of global trade, worth well over $1 trillion in goods a year. While it took six days to free the Ever Given and resume traffic in the waterway, the adverse effects of the blockage can still be felt a month on.
As ships were forced onto longer voyages by the bottleneck, ports around the world continue to deal with a backlog of container vessels. All this has been compounded by rising consumer demand for goods, fuelled by pandemic lockdowns over the past year. With so many vessels carrying millions of containers vying to pass through the canal, delays have been costly, as the surge in container freight rates shows.
Faced with massive scheduling setbacks, many organizations are stuck in a logistical nightmare as they struggle to respond to unexpected incidents like this. The disruption of the pandemic has further exposed global supply chain vulnerabilities. Now more than ever, supply chains need to be agile and flexible to keep up with unprecedented levels of demand, while also adapting to fundamental changes in consumer behavior and routes to market.
The value of real-time data
As the backbone of supply chain management processes, IT infrastructure needs to support real-time data flows for greater flexibility in responding to changing demand or supply situations. Businesses can outmaneuver uncertainty by using real-time data to learn, optimize, and make predictions that reduce the impact of such unforeseen events.
When combined with machine learning and predictive analytics, real-time data can help enterprises in the logistics market proactively identify potential issues and rectify them before unnecessary costs are incurred. For instance, Unilever's Virtual Ocean Control Tower provides teams with highly accurate, continually updated data on the status of each shipment at every stage of the journey, and proactively flags any potential issues — be it port congestion, unexpected additional charges, or temperature deviation — along the way.
In the case of the Suez Canal blockage, real-time data can be used to help maritime companies better determine whether their ships should wait it out or make a detour around Africa. Up-to-date insights are even more valuable in the aftermath, as companies rush to reschedule shipments and seek out alternative delivery methods for customers.
World-class ports are also leveraging real-time data and automation technologies to drive efficiency and resilience. Plans are underway for Singapore’s upcoming Tuas Port to cater for future growth in container-handling demand and meet the needs of shipping mega-alliances.
Moving from batch-based to real-time data movement
As the value of real-time information diminishes with each passing millisecond, organizations need to be equipped to act on opportunities as fast as they can. This is especially important in the global supply chain, where many stakeholders are involved and a significant amount of data must be processed by every player along the chain. That's why it's concerning that some organizations still rely on batch-based processes for data movement, where data is left stranded in stagnant ponds and retrieved only when another system asks for it.
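The contrast between batch retrieval and event-driven movement can be sketched in a few lines. This is an illustrative, in-process toy (the names `batch_poll`, `publish`, and the sample updates are hypothetical), not any particular product's API:

```python
import queue

# Batch style: data accumulates in a store until another system asks for it
# on its next scheduled poll, however long that is.
data_store = []

def batch_poll(store):
    """Retrieve everything accumulated since the last poll, all at once."""
    snapshot = list(store)
    store.clear()
    return snapshot

# Event style: each update is pushed onto a queue for consumers the
# moment it occurs, so nothing sits waiting for the next poll.
event_queue = queue.Queue()

def publish(event):
    event_queue.put(event)

def consume_all(q):
    """Drain every event currently on the queue."""
    events = []
    while not q.empty():
        events.append(q.get())
    return events

# Both paths see the same updates; the difference is when they arrive.
for update in ("ship_departed", "ship_delayed"):
    data_store.append(update)
    publish(update)
```

In the batch path, the delay event is invisible until the next poll runs; in the event path, a consumer attached to the queue could react the instant `publish` is called.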
With so many events occurring throughout the supply chain, companies not only need to react to events as they happen but also be able to run scenario planning and develop action plans for incidents before they happen. Traditional integration solutions such as polling and point-to-point integration will no longer suffice. This calls for a truly real-time, responsive architecture that can stay resilient in today's volatile environment: an event-driven architecture.
An event-driven architecture is a way of building enterprise IT systems that lets loosely coupled applications and microservices produce and consume events. In this context, an event is anything that happens within and to an organization. An event-driven architecture offers huge benefits in scalability, flexibility, and agility that organizations would not get with traditional integration approaches such as polling and point-to-point integration. So much so that analysts predict that over 50% of companies will be participating in an event-driven digital ecosystem by 2022.
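The loose coupling at the heart of this pattern can be illustrated with a minimal in-memory broker. This is a sketch only (the broker class, topic name, and shipment event are invented for illustration, not drawn from any vendor's API): the publisher emits an event without knowing which services, if any, are listening.

```python
from collections import defaultdict

class EventBroker:
    """Minimal in-memory publish/subscribe broker."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to be called for every event on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Every subscriber to the topic receives the event; the publisher
        # has no knowledge of who is consuming it.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received = []

# Two independent services react to the same shipment event without
# being integrated with each other or with the publisher.
broker.subscribe("shipment/delayed", lambda e: received.append(("reroute", e["id"])))
broker.subscribe("shipment/delayed", lambda e: received.append(("notify", e["id"])))

broker.publish("shipment/delayed", {"id": "CONT-4711", "cause": "canal blocked"})
```

Adding a third consumer, say a billing service that accrues demurrage charges, would require no change to the publisher or the existing subscribers; that is the scalability and agility benefit the pattern promises.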
When handling maritime logistics, for instance, an event-driven approach can be used to optimize route planning, bunkering, and port operations. Having real-time data move between devices, buildings, assembly lines, and more can help transportation and logistics service providers monitor and manage assets on the move through the unpredictability of traffic and weather. Insightful decision-making can also be supported by an event-driven digital twin — a virtual simulation that continuously analyzes and models approaches and routes in accordance with everything happening across and to the business at any given moment.
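A digital twin in this sense is simply state that is updated from the event stream rather than from periodic reports. As a rough sketch (the shipment ID, event shape, and 24-hour alert threshold are all hypothetical choices for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class ShipmentTwin:
    """Virtual counterpart of a shipment, kept current by applying events."""
    shipment_id: str
    eta_hours: float
    alerts: list = field(default_factory=list)

    def apply(self, event):
        # Each incoming event adjusts the twin's state; a large ETA slip
        # raises an alert so planners can consider rerouting.
        if event["type"] == "position_update":
            delay = event.get("eta_delta_hours", 0.0)
            self.eta_hours += delay
            if delay > 24:
                self.alerts.append(
                    f"{self.shipment_id}: ETA slipped {delay:.0f}h, consider rerouting"
                )

twin = ShipmentTwin("CONT-4711", eta_hours=120.0)
for ev in [
    {"type": "position_update", "eta_delta_hours": 2.0},    # minor traffic
    {"type": "position_update", "eta_delta_hours": 144.0},  # canal blocked
]:
    twin.apply(ev)
```

Because the twin is updated event by event, the six-day slip surfaces as an alert the moment the blockage event arrives, not at the next nightly reconciliation.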
Moving forward with an event-driven approach
The continuous, real-time data flow between IT and OT systems is critical for a fully integrated and digitized supply chain that will bring about greater situational awareness, real-time responsiveness, and informed decision-making.
If there's anything the Suez Canal blockage has shown, it is that time is of the essence — supply chains need to be nimble and develop rapid responses to shifting landscapes. Overnight batch-based processing and planning will not stand up to changing tides. Rather than hoping for optimal conditions to arise, businesses need to start digitizing their supply chains with up-to-date insights to reduce the impact of events like this to a minimum.
Solace is a middleware company based in Canada. It provides message-oriented middleware appliances and software that routes information between applications, devices, and user interfaces.