14 April 2015
Part of the problem is that bland terms such as ‘human factors’ and ‘ergonomics’ actually hide the true value of the subject, and lead to a prevailing but erroneous belief that human factors is only about designing control panel layouts.
A system should not be viewed as just the lines of code or the shiny metal and plastic. The human element is as much a part of the system as any other component. Perhaps another reason engineers are often sceptical about ‘human factors’ is the counter-intuitive need to focus on the little things. However, it is a multitude of small things that adds up to an increased risk of human error – accidents rarely happen as a result of one particular action, so it is important to pick up the things that could trip up human performance and deal with them as part of a systematic process.
BMT Reliability Consultants’ Human Factors Specialist, Chris Greenbank, argues that in complex engineering programmes people are, in fact, like any other part of a system – they have a certain specification, design parameters and environmental limits. He highlights how a structured and systematic systems engineering approach to Human System Integration (HSI) can not only simplify the process and reduce risk, but also allow stakeholders to reduce the total-life costs of their assets.
Human error is cited as a causal factor in the majority of incidents and accidents across the marine sector. Between 2000 and 2005 an average of 18 ships collided, grounded, sank, caught fire or exploded every single day. This cost the insurance industry alone US$4 million every day – of which around US$2.7 million of losses per day were attributed to human error. Over that period, issues in which people played the dominant part cost the industry a staggering US$10 billion in largely avoidable losses.
Despite these statistics, the industry still faces an uphill struggle to truly get to grips with the ‘human element’. Perhaps this is because the terminology is too vague. There is as much point in referring to the ‘human element’ as there is in referring to the ship as the ‘engineering element’ – neither is specific enough for the area to be considered in meaningful detail. Engineering is about precision. Unless you manage your engineering programme in a very structured and systematic way, it is hard to see how you can succeed. The integration of the human into a system should be considered in exactly the same way, rather than through the scattergun approach that remains common today.
Considering human factors in a systematic way improves safety and performance and drives down total programme life cost. After all, human behaviour is a key part of almost any system, and is at the centre of triumphs – and most disasters. Considering the human element systematically is therefore essential, but it makes business sense too. Accidents of any type are expensive, and anything that drives performance also has the potential to drive competitive advantage.
Although there is a cost to considering human system integration properly and early enough in the acquisition process, that cost is paid only once, and there is mounting evidence that HSI programmes deliver an astonishingly good return on investment, with the cost recouped through savings at every phase of design and across the total life of the vessel. Where an operator needs to compensate for poor design, that cost is paid every day – and even then there is a degree of uncertainty about whether the correct response will be made when it matters. Studies from the air traffic control industry indicate that getting the design right from the start saves up to 100 times the cost of fixing problems later.
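The one-off versus recurring cost argument can be made concrete with some simple arithmetic. The figures below are purely hypothetical assumptions chosen for illustration – they are not drawn from the studies cited in this article:

```python
# Illustrative comparison of a one-off HSI design cost against the recurring
# daily cost of operators compensating for poor design over a vessel's life.
# All figures are hypothetical assumptions, not data from the cited studies.

HSI_DESIGN_COST = 500_000     # one-off cost of early HSI work (assumed, US$)
DAILY_WORKAROUND_COST = 150   # extra crew time, errors, rework per day (assumed, US$)
VESSEL_LIFE_YEARS = 25        # assumed design life of the vessel

# A cost paid every day compounds over the whole service life.
recurring_cost = DAILY_WORKAROUND_COST * 365 * VESSEL_LIFE_YEARS

print(f"One-off HSI design cost:     US${HSI_DESIGN_COST:,}")
print(f"Lifetime workaround cost:    US${recurring_cost:,}")
print(f"Recurring / one-off ratio:   {recurring_cost / HSI_DESIGN_COST:.1f}x")
```

Even with a modest assumed daily cost, the recurring figure overtakes the one-off design cost several times over – which is the essence of the argument for spending on HSI once, early.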
Human System Integration (HSI) – or Human Factors Integration (HFI) in the UK – is a structured process, common in military and other safety-critical industries, that allows the human element to be managed systematically from early design through the total life of the system.
As part of the structured approach, the human element is broken down into a number of domains. Although the formal domain titles vary slightly depending on where in the world you are, they are consistent in the themes they address. Here we refer to seven domains: Manpower, Personnel, Training, Human Factors Engineering, System Safety, Health Hazards, and Social and Organisational. The themes in the US, for example, are consistent, but the titles differ slightly. The key point is not the definitions of the domains – it is the necessity of breaking the human element down into manageable chunks if it is to be dealt with on a practical level from a designer’s and operator’s perspective.
Manpower plays a critical role in any operation, so the cost of personnel is significant. It is therefore important for organisations to feel confident that they have the optimum number of people, with the right set of skills, to do the job effectively and safely, and to identify opportunities to reduce manning levels with the support of the latest technology.
To achieve optimum system performance we must remember that although there is a basic build standard for humans, individuals vary in their experience and other characteristics, including body size and strength. Yet we still see designs where control panels or critical valves are positioned where they cannot be reached by everyone. Designing equipment with a detailed understanding of body size and movement, and of what people can realistically see from their height and location, will result in enhanced performance and fewer errors.
In general, too much reliance is placed on training and procedures as a means to mitigate fundamentally poor system design. Sometimes training and procedures are the only tools available, and this is not to say that training is not required. However, if we as an industry want to bring human error rates down, we have to recognise that training is one of the weakest mitigations for human error, and design the wider system and its checks and balances accordingly.
People are extremely adaptable and will cope with bad design or equipment, or with the most horrendous conditions, but that doesn’t mean they are doing well – it just means they are coping. During design, considerable time is spent on the integration of ‘unchangeable’ components of the system: they have a known specification, interface requirements and limiting factors, and have to be used as they are. The same is true of humans. If you treat the human with the same systems engineering approach, it is far easier to integrate HSI into engineering programmes. The reference data is different, but conceptually the approach is very similar.
When human element issues are put in engineering terms they often become easier to explain and manage. For example, you don’t ask an electrical circuit to take more load than it was designed for, because it will break. The same logic is true of people when it comes to lifting equipment or stores. Equally, it is common to find attention quickly given to the cooling needed to keep equipment within its cleared limits, yet people have environmental limits too, and these are often overlooked – with potential performance and safety impacts.
The person is as much a part of the system as the lines of code or the shiny metal and plastic. That includes leaving space for them to adopt the necessary posture for maintenance tasks. If you put a person in an uncomfortable position and ask them to carry out an intricate task, it is reasonable to expect the job will not always be done as well as intended – regardless of the amount of training and procedures you put in place.
Small things that might be considered obvious are still routinely missed. HSI therefore de-risks the design early and helps avoid costly re-design work later in the programme.
It is something of a paradox that the human element of the system is a causal factor in the majority of accidents, yet is also one of the most widely cited safety mitigations. Too much time is spent discussing the purely negative aspects of being human; the human in the system is in fact one of the best and most effective mitigations for a wide range of failures and accident sequences. The capacity for creative problem solving has saved the day on many occasions, yet those events are rarely newsworthy and therefore feature less prominently in people’s perceptions of human risk.
It is important to approach discussions on the probability of a given human outcome with a robust knowledge base, logic and a clear head. Of course, it’s impossible to cater for everything – it’s about identifying an acceptable level of risk and putting measures in place to minimise it. Even where equipment cannot be modified during the design process, it remains important to understand any deficiencies and confirm that appropriate mitigations are in place.
Linked to system safety, the health hazards domain involves identifying and addressing conditions inherent in the operation or use of a system (e.g. vibration, fumes, radiation, noise, shock) that can cause death, injury, illness or disability, or reduce the performance of personnel.
The social and organisational domain is inextricably linked with all of the other areas. It applies tools and techniques from organisational psychology, management studies, social science, information science and the system-of-systems approach to consider the organisational configuration and social environment – everything from measures to increase retention, to job design and effective communication, and even the selection of people.
Managing human factors in a structured and coherent way is vital for success – be precise, thorough and rigorous. Experimenting around the edges or tinkering in the middle will not deliver the total-life cost and performance benefits. A systematic and structured approach to human system integration will minimise the opportunities for error and provide the best possible chance of optimising the performance of your key assets – your systems, your equipment and your people.
The Human Element: A Guide to Human Behaviour in the Shipping Industry. The Stationery Office, 2010. ISBN 9780115531200.
Wiener, E.L. (1988). Management of Human Error by Design. In: Human Error Avoidance Techniques Conference Proceedings, Society of Automotive Engineers, Inc.
Eurocontrol (1999). Human Factors Module: A Business Case for Human Factors Investment. Report No. HUM.ET1.ST13.4000-REP-02.