Smart-building technology can optimise customer experience and safety, while reducing risks and costs. Grosvenor Engineering Group Managing Director, Nicholas Lianos, explains that the key to achieving these goals is the collection, compilation, analysis and utilisation of building-generated data.
As customers become more demanding, buildings are becoming smarter. Building owners are investing in a broad spectrum of innovative technologies to provide a marketplace differentiator. Often incorporating artificial intelligence (AI) and machine learning (ML), these innovations are helping to meet customers’ growing expectations for elevated levels of comfort and convenience—which also serve to improve productivity—while reducing both operating costs and risks. However, if their full potential is to be realised, these services need to link to the rest of the building’s hard technical services.
There are many examples of these embryonic technologies. Individual thermal and illumination control, for instance, empowers occupants to control both the temperature and the lighting of their own workspace. This results in happier, healthier and more fulfilled tenants, who are able to enjoy a heightened sense of control over their environment. They can tailor their own lighting and temperature to meet their personal preferences or mood, or, more importantly, to perfectly suit the task in which they are engaged.
Interestingly, studies indicate that the provision of individual control tends to result in the overall reduction of lighting levels, rather than an increase as we might reasonably expect. Here, aside from the benefits to comfort, individual controls will help improve energy efficiency, reduce heat generation, and also extend lamp life.
The increasing use of AI assistants provides another good example of an emerging technology delivering benefits to building owners and tenants alike, especially where they are utilised to streamline the human interface for individual control of temperature and lighting. Just as we have become accustomed to using voice commands to interact with our mobile devices and home-automation solutions, AI assistants similarly now allow occupants to interact with technical services within their workspace to a greater degree than ever previously possible.
Occupant environment monitoring can also elevate occupants’ sense of wellbeing, by analysing measured workspace parameters—such as ambient noise levels, humidity, lighting, temperature, and various air-quality parameters including CO2 levels, particulate counts and Volatile Organic Compound (VOC) concentrations. Employee productivity can be enhanced by correlating performance against these factors, which can then be tuned to create the ideal working environment.
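As a rough illustration of that correlation step, the sketch below pairs hypothetical workspace readings with an assumed productivity proxy and computes a simple correlation coefficient. The field names, sample values and the choice of proxy are illustrative assumptions, not measurements from any real building.

```python
# Illustrative sketch only: correlating one workspace comfort parameter with a
# hypothetical productivity proxy. All values below are made up for illustration.
from dataclasses import dataclass
from statistics import correlation  # Pearson correlation, Python 3.10+

@dataclass
class WorkspaceSample:
    co2_ppm: float             # CO2 concentration
    noise_dba: float           # ambient noise level
    temp_c: float              # air temperature
    productivity_index: float  # assumed proxy, e.g. tasks completed per hour

samples = [
    WorkspaceSample(450, 42, 21.5, 1.00),
    WorkspaceSample(800, 48, 23.0, 0.92),
    WorkspaceSample(1100, 55, 24.5, 0.83),
    WorkspaceSample(1400, 60, 25.5, 0.75),
]

# A strongly negative correlation between CO2 and the productivity proxy would
# suggest that ventilation is a factor worth tuning first.
r = correlation([s.co2_ppm for s in samples],
                [s.productivity_index for s in samples])
print(f"CO2 vs productivity correlation: {r:.2f}")
```

In practice the same comparison would be run across every monitored parameter, with the strongest relationships guiding which settings are adjusted.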
Another innovative concept gaining ground is the ‘co-workspace’: work areas shared between different organisations to optimise building utilisation. These spaces will be able to identify the user and adjust facilities to their needs, with automatic billing based on duration of occupancy, facilities used, time of use and demand loading. The technology will allow co-workspaces to offer a seamless experience for both owners and occupants, using two-way communication between multiple systems, such as facial recognition, access control/tracking and people counting.
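To make the billing logic concrete, here is a minimal sketch of how such a charge might be assembled from occupancy duration, facilities used, time of use and demand loading. The rates, peak window and multipliers are assumed for illustration rather than drawn from any actual co-workspace offering.

```python
# Hypothetical co-workspace billing sketch. Rates and multipliers are assumptions;
# a real system would draw them from access-control and metering feeds.
from datetime import datetime

BASE_RATE_PER_HOUR = 12.00                                   # assumed desk rate
FACILITY_RATES = {"meeting_room": 25.00, "av_suite": 15.00}  # per hour, assumed
PEAK_HOURS = range(9, 17)        # time-of-use window (assumed)
PEAK_MULTIPLIER = 1.25           # time-of-use loading
DEMAND_MULTIPLIER = 1.10         # applied when the floor is near capacity

def session_charge(start: datetime, end: datetime,
                   facilities: list[str], high_demand: bool) -> float:
    """Charge one occupancy session: duration x (base + facilities) x loadings."""
    hours = (end - start).total_seconds() / 3600
    rate = BASE_RATE_PER_HOUR + sum(FACILITY_RATES.get(f, 0.0) for f in facilities)
    if start.hour in PEAK_HOURS:   # simplification: peak applies if the session starts in the window
        rate *= PEAK_MULTIPLIER
    if high_demand:
        rate *= DEMAND_MULTIPLIER
    return round(hours * rate, 2)

print(session_charge(datetime(2023, 5, 2, 9, 0), datetime(2023, 5, 2, 12, 30),
                     ["meeting_room"], high_demand=True))
```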
Digging out the data
Each of these innovations has the potential to improve occupants’ safety and comfort, and to reduce building costs. However, for maximum benefit to be achieved, they need to be incorporated into the overall building-technology strategy, and a key part of how successfully this works in practice comes down to the data.
‘Data lakes’—filled with well-structured data—will act as the enabler to extract maximum value from these evolving innovations. A building-technology strategy utilising data in this way will be well placed to identify where the ‘technology gaps’ exist that are directly linked to business drivers, and formulate a plan to close them.
Over time, the data of an asset will become an integral part of its value, just as today we consider a vehicle with a full service history more attractive than a similar vehicle without one. However, to reach its maximum potential an asset’s data needs to be accessible. It cannot be stranded on an isolated server, but needs to be unlocked through the use of emerging digital technologies, such as Big Data Analytics, the Internet of Things (IoT), Digital Twins, Machine Learning (ML) and AI. To complement this machine data, man-made data—originating from servicing, observations and other human/machine interaction—also needs to be valued, ordered, stored and integrated as part of a holistic data-capture process.
With so many options now available, the selection of new technologies needs to be carefully considered. Those that generate data that has little intrinsic value or which muddy the waters need to be avoided. However, new technologies that generate data that offers enhanced insights to fill knowledge gaps—either in isolation, or in conjunction with other data streams—should be embraced, as this will help generate new levels of value.
For data sets to be correctly structured and maintained, the set-up, structuring, renewal and maintenance need to be undertaken by domain experts who have a comprehensive understanding of each data stream. To ensure ongoing innovation and cost reduction in data handling, the asset owner also needs to ensure that the way the data is set up actively promotes competition between suppliers.
A forward-thinking building-technology strategy should incorporate two key elements. Firstly, it must be able to release the ‘trapped’ value of existing data-generating infrastructure. And secondly, it must demonstrate a willingness to embrace any new technologies that are able to unlock previously untapped data streams, in such a way that the combined value to the business of the existing and new data is greater than the sum of the individual parts.
To be viable, the emerging technologies need ‘good’, well-structured data on which to train. Over time, the continued flow of good data will increase the value of the asset, irrespective of changes of contractors (acting as data creators) or of the assets themselves changing owners. Furthermore, feeds from existing and future sub-contractors need to be utilised to generate ‘good’ data from maintenance, repair, refurbishment, control and monitoring activities.
Roadmap to realisation
The structuring of data is the difference between a value stream to the business and a data swamp. Value is created when analysis of well-structured data unveils new insights that lead to better decisions in the future. Once data structures have been created, it is imperative that they are maintained to reflect changes within the asset. Each change should refresh existing data in the data lakes to ensure ongoing value generation.
Good data supports the creation of knowledge that promotes pro-active action, leading to the delivery of outcomes that meet key business drivers. These include improving the customer experience, enabling more flexible workspaces, enhancing tenant retention, lowering operational risks and costs, increasing asset values, promoting sustainability and even achieving carbon neutrality.
As part of a building-technology strategy, there needs to be a roadmap that incorporates a gap analysis. The purpose of this is to map out future tenant and occupant drivers, evaluate and quantify stranded digital data sets, assess man-made data, identify data roadblocks, and review new proven data-generation technologies that are able to both generate and liberate the trapped value.
For the best results within a large, complex organisation, a methodical approach needs to be established on a trial site that is representative of the portfolio, and an owner-specific smart-building transition process developed as part of this trial. The site should be assessed for all current systems and their data-generation streams, which need to be mapped out to identify where the gaps exist. These will include man-made, machine, and unstructured data-generation points, the latter including areas such as social media. A high-level evaluation of each item will identify which data sets can enhance business outcomes and are therefore worth including. The results can be sorted by ‘business value versus cost’ to establish a program of data-integration projects in order of relative importance.
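The ‘business value versus cost’ sort can be as simple as ranking each candidate data stream by its value-to-cost ratio, as in the sketch below; the streams and scores shown are purely illustrative, not assessments of any actual portfolio.

```python
# A sketch of the 'business value versus cost' ranking described above.
# Candidate data streams and their scores are illustrative assumptions only.
candidates = [
    {"stream": "BMS alarm history",        "value": 8, "cost": 3},
    {"stream": "Technician service notes", "value": 7, "cost": 5},
    {"stream": "Tenant comfort feedback",  "value": 6, "cost": 2},
    {"stream": "Social media mentions",    "value": 3, "cost": 4},
]

# Rank by value-to-cost ratio to produce a first-cut data-integration program.
program = sorted(candidates, key=lambda c: c["value"] / c["cost"], reverse=True)
for i, c in enumerate(program, 1):
    print(f"{i}. {c['stream']} (value {c['value']}, cost {c['cost']})")
```

Real evaluations would of course weigh more than two dimensions, but even a simple ranking of this kind gives the integration program a defensible starting order.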
Smart-building structural data issues then need to be addressed. As smart-building solutions are long-term propositions, broad stakeholder engagement will prove invaluable to ascertain the issues that affect each part of the organisation, such as IT, Valuation, Operations and Asset Management. The findings of the trial site can be used to establish a smart-building gap analysis methodology and transition plan. This can then be applied across the entire portfolio.
The biggest challenge in establishing an effective building-technology strategy is how the data is handled. Data structuring and tagging is a critical element of realising value from the data sets, and the organisation doing it must have an in-depth appreciation of the often-complex connections that exist within each data source, such as the inter-relation between man-made, energy/water and machine data sets.
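By way of illustration, the sketch below shows one possible way of tagging data points so that machine, energy/water and man-made records relating to the same asset can be cross-referenced. The tag names and identifiers are hypothetical, loosely in the spirit of open tagging conventions used in building services, and not a prescribed schema.

```python
# Minimal tagging sketch: structured metadata lets otherwise separate data sets
# (machine, energy and man-made) be linked back to the same physical asset.
# All identifiers and tag names below are hypothetical.
points = [
    {"id": "AHU-03-SAT", "tags": {"site": "towerA", "equip": "AHU-03",
                                  "measure": "supplyAirTemp", "unit": "degC"}},
    {"id": "AHU-03-kWh", "tags": {"site": "towerA", "equip": "AHU-03",
                                  "measure": "energy", "unit": "kWh"}},
    {"id": "WO-0457",    "tags": {"site": "towerA", "equip": "AHU-03",
                                  "measure": "serviceRecord"}},
]

def related(equip: str) -> list[str]:
    """Return every point (machine, energy or man-made) tagged to one asset."""
    return [p["id"] for p in points if p["tags"].get("equip") == equip]

print(related("AHU-03"))  # ['AHU-03-SAT', 'AHU-03-kWh', 'WO-0457']
```

The value lies in the consistency of the tags: once every stream carries the same asset references, the inter-relations described above can be queried rather than reconstructed by hand.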
Moreover, a building-technology strategy is not a static entity, as each asset within the building is changing constantly. An effective strategy therefore should be equally dynamic—almost organic in nature. The maintenance aspect is often overlooked when data sets are established, but data upkeep needs to accommodate the evolving nature of buildings and the upgrades that occur within them. In this way, the data will grow as the building changes, providing the knowledge to facilitate future building and occupant requirements efficiently and effectively.