The Last Planner System® has traditionally been associated with whiteboards, tape and sticky notes in a construction trailer Big Room. Many teams still use this process. When effectively deployed, it works as promised.
However, planning using analog methods does not typically facilitate the collection of data. With a sticky-note process for construction planning, Percent Plan Complete (PPC) is calculated manually by counting cards. Other metrics and key performance indicators are nearly impossible to track.
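To make the contrast concrete: once commitments exist as structured records instead of sticky notes, PPC becomes a one-line calculation. The sketch below is a minimal illustration in Python; the field names and sample commitments are invented for the example and do not reflect any particular tool's data model.

```python
from dataclasses import dataclass

@dataclass
class Commitment:
    """One weekly work-plan commitment (field names are illustrative)."""
    activity: str
    trade: str
    completed: bool

def percent_plan_complete(commitments: list[Commitment]) -> float:
    """PPC = completed commitments / total commitments, expressed as a percentage."""
    if not commitments:
        return 0.0
    done = sum(1 for c in commitments if c.completed)
    return 100.0 * done / len(commitments)

week = [
    Commitment("Hang drywall, level 3", "Drywall", True),
    Commitment("Rough-in electrical, level 4", "Electrical", False),
    Commitment("Set AHU-2", "Mechanical", True),
]
print(f"PPC this week: {percent_plan_complete(week):.0f}%")  # -> PPC this week: 67%
```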
The collection and processing of data is where technology can excel — capturing data in the background without interfering with the collaborative process. Collection of data as a by-product of day-to-day work can open up a new world of insights, especially for construction teams and domain experts who are responsible for project delivery.
Auto-generating PPC reports is a simple first step in the data journey. But it also raises the question: What other insights could we gain from project data?
Data organization makes analysis possible. Relationships need to be identified: each activity needs to be associated with a lane, a date and a trade. Exporting plan data into a series of structured tables makes it possible to use analytical tools to gain new insight into a project’s performance. Are some trades performing well? Some not so well? Are milestones being achieved by the dates on the schedule? Can we identify why work is running late? If we can pinpoint why work is not done or is late, we have a roadmap for potential improvements.
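As a rough illustration of what a series of structured tables might look like, the sketch below uses the pandas library with hypothetical column names (activity_id, trade, lane, planned_date, completed, variance_reason). The specifics are assumptions; the point is simply that once the relationships are captured, the questions above become short queries.

```python
import pandas as pd

# Hypothetical structured export of plan data: one row per planned activity,
# each linked to a trade, a lane (work area) and a planned date.
activities = pd.DataFrame([
    {"activity_id": 1, "trade": "Electrical", "lane": "Level 2 East",
     "planned_date": "2024-05-06", "completed": True,  "variance_reason": None},
    {"activity_id": 2, "trade": "Drywall",    "lane": "Level 2 East",
     "planned_date": "2024-05-06", "completed": False, "variance_reason": "Material delivery"},
    {"activity_id": 3, "trade": "Electrical", "lane": "Level 3 West",
     "planned_date": "2024-05-07", "completed": True,  "variance_reason": None},
])

# With the data in tables, the questions become simple queries:
ppc_by_trade = activities.groupby("trade")["completed"].mean() * 100
late_reasons = activities.loc[~activities["completed"], "variance_reason"].value_counts()
print(ppc_by_trade)
print(late_reasons)
```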
The analytical layer is where the real magic happens. Tools like Microsoft® Power BI can provide fresh insights into team and project performance through visual analysis. For example, a trend line showing changes in team performance over time provides insight at a glance. Giving all members of the team the opportunity to drill into the data can provide additional insight and generate ideas for improvements.
There is a tremendous opportunity to combine domain expertise (a skilled trade knowing what to look for) with access to simple tools (Power BI) for detailed analysis. Extensive filtering is built in. For example, a trade could quickly view its resource commitments planned for the next few weeks with a simple click.
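A minimal sketch of that kind of drill-down, again with invented column names and sample data: filtering one trade's commitments over a date window and building a weekly PPC trend are each a couple of lines once the plan lives in a table. Power BI exposes the same operations through clicks; the code form is shown here only to make the mechanics explicit.

```python
import pandas as pd

# Hypothetical plan export: one row per commitment (columns are illustrative).
plan = pd.DataFrame({
    "trade":        ["Electrical", "Electrical", "Plumbing", "Electrical"],
    "activity":     ["Pull wire L2", "Trim L1", "Set fixtures L1", "Panel terminations L3"],
    "planned_date": pd.to_datetime(["2024-05-14", "2024-05-21", "2024-05-15", "2024-06-04"]),
    "completed":    [True, False, True, False],
})

# "A trade could quickly view its resource commitments planned for the next few weeks":
today = pd.Timestamp("2024-05-13")
window = (plan["planned_date"] >= today) & (plan["planned_date"] <= today + pd.Timedelta(weeks=3))
upcoming_electrical = plan[(plan["trade"] == "Electrical") & window]
print(upcoming_electrical[["activity", "planned_date"]])

# A simple trend line: PPC by week across the whole plan.
weekly_ppc = plan.set_index("planned_date")["completed"].resample("W").mean() * 100
print(weekly_ppc)
```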
But this also leads to more questions. What other insights are possible? If we had additional data, could we find new insights and ways to improve?
Data doesn’t exist in a vacuum. The contextual layer involves adding external information to our analysis to provide a richer understanding. This could include past performance, quantities from building information modeling and site resource data from a badging application.
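As one hypothetical illustration of the contextual layer, the sketch below joins plan performance to badge-in counts from a site-access system on a shared date and trade key. The table and column names are invented, but the pattern, enriching commitments with external context before analysis, is the general idea.

```python
import pandas as pd

# Hypothetical plan performance and badge-in counts from a site-access system.
plan = pd.DataFrame({
    "date":  ["2024-05-06", "2024-05-06", "2024-05-07"],
    "trade": ["Drywall", "Electrical", "Drywall"],
    "planned_activities":   [6, 4, 5],
    "completed_activities": [4, 4, 3],
})
badges = pd.DataFrame({
    "date":  ["2024-05-06", "2024-05-06", "2024-05-07"],
    "trade": ["Drywall", "Electrical", "Drywall"],
    "workers_on_site": [8, 5, 4],
})

# Enrich plan performance with actual crew presence for the same day and trade.
context = plan.merge(badges, on=["date", "trade"], how="left")
context["ppc"] = 100 * context["completed_activities"] / context["planned_activities"]
print(context[["date", "trade", "workers_on_site", "ppc"]])
```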
Over the past five years of working with customers, we’ve experienced a continuous feedback loop. We started with a few Power BI reports to show the possibilities. Once teams discover the power of data, they ask for new reports. In some cases, new data is required. As we create new reports, the result is new requests, often with new data requirements. This feedback loop has been supercharged by the inherent knowledge of those who are doing the work (the last planners).
One of the most powerful aspects of data analysis is its ability to predict future outcomes. The predictive layer uses historical data and advanced algorithms to forecast future trends and behaviors. This can be invaluable for businesses looking to stay ahead of the competition, as it allows them to anticipate market changes and adjust their strategies accordingly.
Capturing transactional data over time adds another layer of data beyond the current status of a project. It enables teams to move beyond reporting on what is to understanding what happened. And when transactional data is combined with historical data from other projects, the insights generated start to become predictive. For example, if we continue on our current path, what is the likely result?
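A deliberately simple sketch of the "continue on our current path" question, using an invented PPC history: fit a trend to past weeks and extend it forward. Real predictive models would draw on much richer transactional and cross-project data, but the shape of the question is the same.

```python
import numpy as np

# Hypothetical weekly PPC history for a project (illustrative numbers only).
weeks = np.arange(1, 9)
ppc   = np.array([58, 61, 60, 65, 68, 66, 71, 73])

# The simplest possible projection: fit a straight line to the history
# and extend it a few weeks forward.
slope, intercept = np.polyfit(weeks, ppc, deg=1)
future_weeks = np.arange(9, 13)
forecast = slope * future_weeks + intercept
for w, p in zip(future_weeks, forecast):
    print(f"week {w}: projected PPC of about {p:.0f}%")
```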
ChatGPT was launched just over 2 years ago. The excitement (and fear) that followed the release has accelerated the conversation about the potential for artificial intelligence (AI) for a wide variety of applications, including construction.
To be effective, AI needs data. Lots of it. In the context of collaborative planning and Lean construction, capturing project data while we plan and work provides a great source of potential insight. But plan data can be combined with data from other related sources to provide another layer of insights.
Finally, the prescriptive layer is where data analysis not only predicts future outcomes but also recommends specific actions to achieve desired or improved results. This layer involves optimization techniques and scenario analysis to suggest the best course of action. For example, a project team could use this as a what-if tool to look for alternative plans based on the current project plan, commitments and historical performance data.
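A toy sketch of the what-if idea, with invented reliability numbers: score candidate weekly plans against each trade's historical PPC and compare the expected outcomes. A real prescriptive tool would optimize over far more variables (crew sizes, sequencing, constraints), but this is the basic pattern.

```python
# Hypothetical historical reliability per trade (share of commitments kept).
historical_ppc = {"Electrical": 0.82, "Drywall": 0.64, "Mechanical": 0.75}

# Two candidate plans for next week: commitments requested from each trade.
scenarios = {
    "Plan A": {"Electrical": 10, "Drywall": 12, "Mechanical": 6},
    "Plan B": {"Electrical": 12, "Drywall": 8,  "Mechanical": 8},
}

def expected_completions(plan: dict[str, int]) -> float:
    """Score a scenario by the completions expected at historical reliability."""
    return sum(count * historical_ppc[trade] for trade, count in plan.items())

for name, plan in scenarios.items():
    total = sum(plan.values())
    expected = expected_completions(plan)
    print(f"{name}: {expected:.1f} of {total} commitments likely to complete "
          f"(expected PPC {100 * expected / total:.0f}%)")
```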
This is where the power of AI could be effectively applied: assisting people by quickly performing tasks that would take humans far longer and would therefore be impractical to do manually.
We are in the early stages of AI adoption, and I expect the feedback loop of data, analysis and new data requirements to continue as we learn.
The beauty of data capture is that it doesn’t have to come at a cost; collection can simply happen in the background. And capturing data is essential to enable teams to understand barriers to on-time performance and to make process improvements. Data captured on one project can support analysis and aid improvements during that project’s delivery. Data captured across many projects can become a learning engine and help an enterprise improve on future projects.
The use of data ultimately has the potential to drive improvements in on-time performance, enhance productivity and improve profitability for all stakeholders.