Over the course of 15 years I had the good fortune to attend research conferences in the UK, Europe and the USA, and to present work in Australia. Experimental physics was an (almost) cut-throat business in which we were always trying to achieve the highest something or the lowest whatever. Competition and camaraderie went hand in hand, however, to the extent that I was invited to spend time working with a research organisation in Germany that was our direct European ‘competitor’. I didn’t go in the end, but we invited one of their employees to join us for a couple of months to show us a different spin on experimental techniques and to test our kit with their samples. It was, to all intents and purposes, a free exchange of information, and I know from colleagues still involved in scientific research that sabbaticals and research trips are still highly valued.
Fast forward to early 2018, and I had the pleasure of attending the Workplace Trends Research Conference in London with my colleague Phil Muir from Space Solutions. It was my first conference to do with workplace, and my first while employed in the private sector. Immediate observations? There was a much better gender balance in the room than at science conferences, but the same wildly varying presentation proficiency was in evidence; the talks ranged from micron-sized fonts to someone dressed as an 18th-century government official.
What struck me at Workplace Trends was that there was obviously some real research behind many of the presentations – statistics, probabilities, computer modelling – alongside more subjective analyses of workplaces based on observation and a psychology-based understanding of human nature. The organisers had chosen the presenters based on an ‘author-blind’ reading of the abstracts. Judging by the introductions from the Chair, a few of the names were very well known, which raised the questions (a) how many people submitted abstracts, and (b) how ‘blind’ was the judging? But I digress.
Was there the same free exchange of information as at science conferences? To some extent, yes. Organisations such as Leesman are all about sharing data with anyone willing to read their reports (“Stop Guessing, Start Gathering”). They don’t spin the data, they just report trends. And the trends are interesting, if not surprising – noise is still a big issue in the workplace, and quiet places for focused work are increasingly valued, even more so than spaces designed for collaboration.
In this day and age, when clients ask ‘but why?’ in response to our statements about what a new working environment should encompass (and, to be honest, we consultants should constantly be asking this of ourselves), it is invaluable to have real data to back up our often highly subjective and woolly statements. Statistics imply a sense of rigour and gravitas, especially when accompanied by a footnote in a small (i.e. illegible) font or a reference at the end.
So why are Leesman important in this sector? Because they have no agenda (or at least I don’t believe they do); their main aim is to independently measure the effectiveness of workplaces. They are not furniture suppliers presenting data that, surprise surprise, shows height-adjustable desks to be the best thing since sliced bread (the jury is out on that one anyway). They are not pushing a health or wellbeing accreditation scheme with glossy photos of biophilia-inspired offices and smiling faces. Slide after slide of graphs and pie charts, percentages and arrows doesn’t have the wow factor that a designer’s presentation might have. And while Tim Oldman can tell a good story and is an excellent presenter, to be honest it’s not about him; it is about the data. They are also important because of the immense scale of that data: over 250,000 data points by early 2018.
At the level of a workplace consultancy and design practice, we also need to generate our own data, and this is where the difficulty lies. In an ideal world we would say to a prospective client: “Our project with Client A resulted in a B% satisfaction level with the new office – an increase of C% on their previous environment.” This is not an ideal world, however, and too often clients are unwilling to ask these questions: sometimes because a full post-occupancy evaluation (POE) costs money, sometimes because feedback on a project can reveal uncomfortable truths. Does it cost much to set up a workplace survey or POE? The time (and therefore cost) lies not in setting up the survey but in the analysis and interpretation. Even then, is it not incumbent on consultants and designers to push for the client to pay for it or, if they are reluctant to spend more money, to fund and carry out the study themselves? Once the data has been gathered and analysed – especially if the client hasn’t paid for it – it should definitely be published for all to see (including potential clients), in the first instance on the company’s own website. In a roundabout way, it is a form of internal quality control and an indicator of a ‘lessons learned’ philosophy.
In doing the above there is a semblance of the free exchange of information on which real academic research is built, even if it takes a bit of searching to find and shows the data warts and all.
Comments to firstname.lastname@example.org