We are nothing after our death. Let us donate our body organs for the benefit of the poor.

Be not afraid of anything. You will do marvelous work. The moment you fear, you are nobody. - Swami Vivekananda

If you think safety is expensive, try an accident... - O. P. Kharbanda

If preventable accidents are not prevented due to our negligence, it is nothing short of murder. - Dr. Sarvepalli Radhakrishnan, 2nd President of India

Zero accidents through zero unsafe behaviors. Do not be complacent just because there are no accidents. There may be near miss accidents (NMAs) in which, by luck or chance, somebody escaped, knowingly or unknowingly. But we cannot be safe if we depend on luck.

Safety culture is how the organization behaves when no one is watching.

We make no compromise with respect to Morality, Ethics, or Safety. If a design or work practice is perceived to be unsafe, we do not proceed until the issue is resolved. - Mission statement of S&B Engineers and Constructors Ltd., http://www.sbec.com/safety/

Human meat gets the least priority. - A doctor's comment on accidents

CSB video excerpts featuring Dr. Trevor Kletz: http://www.youtube.com/watch?v=XQn5fL62KL8

Nov 11, 2013

Trade off between production and safety

Everybody says in public forums that safety is their first concern. But there is also something called ALARP (As Low As Reasonably Practicable). What is the measure of this ALARP? What is ALARP for me need not be the same for others. Still, we have to apply our wisdom so that we do not endanger the lives of workmen, the public, and the environment. One should not go by apparent or immediate effects alone; instead, one should look for effects that are invisible but disastrous, and for long-term effects.
Mostly, shop floor managers see the visible effects whereas safety professionals see the invisible effects. This is where the friction arises, and safety managers are pressured to yield to arguments of "there are no effects, do not imagine wildly".
Safety managers are treated as untouchables or non-entities and ridiculed at every opportunity. It is true that not every known hazard leads to a disaster, and this lends strength to the arguments of short-sighted shop floor managers, while safety managers can only watch in despair.
People do not try to learn from case studies, and safety managers are shouted down when they point out similar occurrences, sometimes even branded as brainless fellows.
But when something does happen, the black sheep will be the safety manager. Then safety managers are ridiculed once more, for not foreseeing the hazards and for doing nothing.

The solution is to fix responsibility. If the safety manager is lax, he should be held responsible. If the production manager does not listen in spite of safety pointing out the hazard, he should be held responsible. Management should not try to protect production managers in those situations. And every near miss/incident, irrespective of its severity, should be investigated and responsibility fixed. Only then, I feel, will everybody understand safety and its benefits.
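
To make the ALARP idea less abstract, here is a minimal sketch in Python of how a likelihood-severity score can be banded into "intolerable", "ALARP region", and "broadly acceptable". The 5x5 scales and the band thresholds are illustrative assumptions I have chosen for this example, not values from any standard; every organization has to calibrate its own matrix.

    # Minimal ALARP banding sketch. The 5x5 scales and thresholds are
    # illustrative assumptions, not values from any standard.
    def risk_score(likelihood: int, severity: int) -> int:
        """Simple risk matrix: both inputs on a 1 (lowest) to 5 (highest) scale."""
        assert 1 <= likelihood <= 5 and 1 <= severity <= 5
        return likelihood * severity

    def alarp_band(score: int) -> str:
        if score >= 15:
            return "intolerable: stop work until the risk is reduced"
        if score >= 5:
            return "ALARP region: reduce further unless the cost is grossly disproportionate"
        return "broadly acceptable: monitor and maintain existing controls"

    # A rare but fatal hazard still lands in the ALARP region - the
    # "invisible effect" that shop floor arguments tend to dismiss.
    print(alarp_band(risk_score(likelihood=1, severity=5)))
    print(alarp_band(risk_score(likelihood=4, severity=4)))

The point of the middle band is that it never means "do nothing": the burden of proof is on showing that further risk reduction is not reasonably practicable, not on the safety manager to prove that harm is imminent.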

Nov 24, 2012

A day full of near misses

Today was not a good day for me with respect to my safety on the road. I started out on my bike in the morning to drop my child at school, and we almost lost balance when the front wheel of my bike went into a joint between two blocks of the concrete road. I regained my balance immediately, thanks to the reflex action of my left foot taking support on the road.

Later, when I started for my factory, a school bus suddenly came close to me from my left to take a right turn without any indication. The road is about 30 feet wide and there are no lanes to follow. I slowed down to let the school bus pass and avoided getting hit.

After a 5-6 km ride, first a car and then a bus from the opposite direction, one after another, crossed the double line on the road, encroaching on the side meant for vehicles moving in the opposite direction. Again I escaped by swerving left, and those vehicles immediately went back within their side of the line. At that moment, any vehicle coming up behind me could have hit mine because of my movement to the left, but, as luck was still with me, there were no vehicles behind.

In all the above incidents, I was riding cautiously at about 30-35 km/h and thus could avoid a fall or a collision with other vehicles. Still, I cannot think of what I should have done to avoid these near misses, except in the first incident, where I could have avoided riding into the road joint.

Two weeks ago, when I was travelling in a company vehicle to the airport in Mumbai, the driver remarked that one should avoid driving close to the road divider, particularly at night and in the early hours, as drivers under the influence of alcohol, or those driving at high speed, can lose control of the wheel, cross over the divider, and hit oncoming vehicles travelling adjacent to it.

Soon after, when I reached my place and was coming out of the airport, I saw a car that had landed in the thick bushes between the opposite carriageways being towed away by the traffic police. It appears some rash driver could not take the turn along the curve because of high speed and ended up in the bushes on the roadside. The car had stopped exactly at the edge of the road divider; had it moved a foot further, it could easily have hit any of the vehicles coming the other way on that busy airport road.

The above is a lesson for me: we should not drive too close to the road divider.

Aug 14, 2012

Act on near misses instead of cleaning up after accidents

In an article, http://www.wired.com/wiredscience/2012/08/st_essay_close_calls/, it is stated that serious accidents at Dow Chemical dropped by 80% after the company implemented close-call reporting. Though it looks cumbersome and time-consuming initially to act upon near misses, it is an opportunity to purge the system of unsafe conditions and will prove beneficial to the organisation in the long run. Managers should encourage their employees to report all near misses, appreciate them instead of chiding them, and be grateful to the workers for reporting.
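
As a sketch of what "acting on near misses" can look like in record-keeping terms, here is a minimal close-call log in Python that flags any report left uninvestigated past a deadline. The field names and the 7-day deadline are my own assumptions for illustration, not Dow's system.

    # Minimal near-miss log sketch; field names and the 7-day
    # investigation deadline are illustrative assumptions.
    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class NearMiss:
        reported_on: date
        description: str
        investigated: bool = False

    def overdue(log, today, deadline_days=7):
        """Return near misses still awaiting investigation past the deadline."""
        cutoff = today - timedelta(days=deadline_days)
        return [nm for nm in log if not nm.investigated and nm.reported_on <= cutoff]

    log = [
        NearMiss(date(2012, 8, 1), "Forklift reversed without a banksman"),
        NearMiss(date(2012, 8, 10), "Slippery patch near acid unloading bay", investigated=True),
    ]
    for nm in overdue(log, today=date(2012, 8, 14)):
        print("OVERDUE:", nm.reported_on, nm.description)

Even a list this simple makes the backlog visible, which is the opposite of reporting near misses and then forgetting them.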

Nov 2, 2011

Flirting With Disaster - some notes

http://www.flirtingwithdisaster.net/
FLIRTING WITH DISASTER
Why accidents are rarely accidental
Marc Gerstein with Michael Ellsberg

The book is about case studies, root causes, and the lessons to be drawn. The following is an extract of points from the book that are useful in implementing safety at the workplace.
  1. Organizations do not learn routinely and systematically from past errors and disasters – in fact, they rarely ever do.
  2. A deliberate decision not to try to learn from accidents is an anti-learning mechanism. It arises because the blame and punishment/penalty that follow the identification of mistakes outweigh the benefits of understanding what should be done within the organization to avoid such mistakes.
  3. There is strong and successful resistance within many organizations to studying or recording past actions leading to catastrophe, because doing so would reveal errors, lies, or even crimes.
  4. Many accidents are not accidents at all. They were imagined and accurately predicted, but the alarms were ignored by those who had the power to disregard them. It is hard to grasp the scale of suffering such mistakes can create.
  5. Some saw the warning signals, but the warnings were not voiced in such a way, or to the relevant people, as to galvanize them into action. This phenomenon is called ‘bystander behavior’.
  6. Organizational bystanders are individuals who fail to take action even when important threats or opportunities arise. They often have crucial information or a valuable point of view that would improve an organization’s decision-making, but for a variety of psychological and institutional reasons, they do not intervene.
  7. Observers are not likely to act if “better-qualified” authorities or experts are present nearby.
  8. Bystander behavior is more likely to occur in organizations with strong hierarchies and rigid group boundaries that are populated with leaders lacking the ability to value, foster, and manage dissent. Such organizations are also more likely to be staffed by midlevel managers who lack the motivation or skill to elicit views that differ from those of their bosses. When those in the middle suspect that things are amiss, they tend to ignore their own concerns. Instead, they defer to others in authority, justifying their inaction by attributing greater knowledge and wisdom to their superiors.
  9. Short-term thinking about money is a factor in many accidents.
  10. Dangers arise when regulators and watchdog agencies develop financial and political ties to the entities they are supposed to be regulating and watching.
  11. Regulators are morally culpable when they do not take action.
  12. The collapse of firms resulted from the corrosive effects of envy, greed, and divided loyalties, combined with the deeper issue of organizational culture and its role in the fostering of disaster. The consequences are severe when watchdogs become consultants to the firms they watch.
  13. Many solutions to risk reduction involve going against our beliefs and biases. When ignored, most risks do not somehow take care of themselves or simply cease to be an issue.
  14. Each uncorrected risk is one more accident waiting to happen.
  15. Truth will not come out in organizations that punish offenders. Accident investigation is a fact-finding mission, not a fault-finding mission.
  16. Many disasters, including natural disasters, are preventable. In all cases, the severity can be reduced by better planning, hard work, and a mind open to the nature of risk. The question is whether we have the wisdom and the will to change.
  17. Risk versus uncertainty: Risk is associated with something going badly wrong, whereas uncertainty involves outcomes of any kind.
  18. Unknown probabilities are riskier.
  19. People see greater risk when making decisions about which they feel comparatively ignorant (though sometimes it is the other way around). The more we know, the less respect we give.
  20. In the modern world, many of the unfortunate outcomes occur to other people, not to the decision-maker and his kin.
  21. Being RISK BLIND underlies most tragedies. Knowledge should be available to, and understood by, the decision-makers.
  22. Technology is not always well behaved. Innovators do not fully understand the behavior of the systems they create.
  23. Emerging technology has not had the time to accumulate a substantial body of experience through use under varied conditions (i.e., it is not fully tested before being put into practice).
  24. Engineering personnel might have a hunch about a particular risk but lack culturally acceptable proof that the risk is real. In such a situation, the organization can behave as if conditions are safe until the hunch can be verified as a real risk through further testing or a real-life accident. Conversely, the organization can assume that conditions are risky until it can be proved safe.
  25. Wishful thinking: thinking whatever pleases us.
  26. We see what we expect to see, not what is actually there.
  27. Under pressure, people often see what they want to see, especially if they are pushing the company and subordinates in a particular direction.
  28. Causes for an accident: i) cold causes: unintentional mistakes, although not unimportant; ii) warm causes: include ignoring weak signals of danger and other bureaucratic inefficiencies in response to indications of risk – these choices appear less innocent than many design errors because they involve decision-makers’ priorities and judgment in the face of explicit, identified risks; iii) hot causes: deliberately subordinating safety to financial and political pressure – unethical and immoral decisions – often consist of conscious decisions that may well expose people to harm without their knowledge, and certainly without their consent.
  29. Design errors are central to many accidents – not visible till tragedy strikes.
  30. Design weaknesses often fall into two categories: the obvious and the subtle.
  31. Faulty design creates latent unsafe conditions that can result in an accident under particular circumstances.
  32. Design issues are the responsibility of the management, not of the workers.
  33. People are tempted by short-term gains or coerced by social pressure, and their risky behavior is then strongly reinforced when they repeatedly get away without incident. People develop comfort with deviations that have not yet caused an accident and forget to be afraid.
  34. Inability to eliminate recurring warning signals shows system failure.
  35. Eleventh-hour meetings are generally ineffective environments, in which unpopular theories with little evidence are not considered or given due weight.
  36. Ignoring weak signals is the norm in many organizations; it occurs in business and public policy as well as in science.
  37. It is easy to find the causes of an accident after it occurs; one should find them before it occurs.
  38. Progress inevitably engenders risk.
  39. Improving safety also encourages risk taking.
  40. People rely on instrumented systems assuming they function as per design intent, but the actual behavior of these instruments can vary depending on their installation. When they do not behave as expected, difficulties arise that are most severe during emergencies.
  41. If responses take decades but hazards take far longer to develop, all is well. If the relationship is reversed - as was the case during floods - then things may end in disaster.
  42. In many organizations, decisions have to be approved by higher-ups, a process that inevitably slows things down.
  43. Redundancy is often the key to risk protection.
  44. Complex systems introduce unknown failure scenarios (KISS – keep it simple, stupid).
  45. Many accidents can be traced to faults with monitoring and control systems, information overload on the operator, and inadequate training.
  46. The shift to software-intensive systems has made man-machine partnerships far more complex than we fully understand. Highly reliable technology makes people less vigilant, since human beings are not effective monitors of situations that rarely fail. Employing more comprehensive and reliable systems only exacerbates the problem. Although such systems are more reliable, they are more boring to monitor as well as more difficult to diagnose.
  47. Layers of protection include safety procedures; training programs; specialized hardware interlocks; monitors, alarms, and warnings; and various forms of containment systems.
  48. Catastrophes occur when defensive systems fail or are deliberately disabled.
  49. Butterfly effect: the idea that small differences can lead to major consequences down the road and at a distance is often called the butterfly effect (small things grow big; monsters can look innocent).
  50. Energy conservation would not only reduce dependence on imported oil, but would also save consumers money and cut urban air pollution, acid rain, greenhouse gases, the production of radioactive wastes, trade deficits, and the long-term defense costs of protecting oil installations.
  51. Many important dynamics take a long time to have a visible effect.
  52. Facing the choice between the short-term requirements versus the long-term needs is not an easy decision.
  53. Understanding how an organization recognizes the hazards it faces, as well as how it changes in response to those hazards, is essential to avoiding disaster.
  54. Culture consists of emergent organizational properties that cannot be separated from history, especially the actions taken by company leaders.
  55. Basic cultural assumptions are deep-level tenets that employees and members of organizations hold to be true, often without realizing it. Over time, decisions that may start out as opinions, personal preferences, or practical necessities evolve into internalized truths that become second nature throughout the organization. Organizational members who “think the unthinkable” find themselves fighting a war on two fronts: the need to prove their case, and the need to establish the legitimacy of the arguments on which their case is based.
  56. Easter Island: Easter Island, the most remote inhabited place on earth, located in the South Pacific Ocean, is not ideal for new inhabitants because of its conditions, yet it contains giant stone statues. Read the story of how cultural change brought self-destruction in the book (http://flirtingwithdisaster.net/easter-island_321.html).
  57. Organizational tunnel vision: People within organizations obsessed with maximizing a single metric are especially prone to being blind to other considerations. In order to keep a schedule, engineers with safety concerns have to prove that their concern is valid and the scheduled activity unsafe, rather than the activity having to be proved safe (program engineers may ask the safety person, ‘show me how it is unsafe’, instead of analyzing the concern themselves and proving to the safety engineer that it is safe).
  58. Tsunami, December 2004: A schoolgirl, Tilly Smith, on vacation at Maikhao Beach, Thailand, noticed frothing and a rapid receding of the ocean waters and alerted her mother, as her teacher had taught that such phenomena are signs of an impending tsunami. Her action saved the lives of everyone on the beach. (Tsunami waves can travel at about 500 miles per hour across the deep ocean; see the worked check after this list.)
  59. Rules for preventing and coping with accidents:
    1. Rule # 1: Understand the risks you face. Evaluate the hazards every time you face them. Probabilities don’t matter once an event with serious consequences, like a tsunami, occurs. Whatever the probability, in the words of Trevor Kletz, “we have done it this way 100 times” is not acceptable unless an accident on the 101st time is acceptable. Take action assuming the probability is 100% all the time.
    2. Rule # 2: Avoid being in denial. Do not neglect warning signs or dismiss them as silly.
    3. Rule # 3: Pay attention to weak signals and early warnings. These are a telegraphed warning of possible danger. Accidents don’t just happen and are often not accidental at all. Do not treat a warning as a one-time affair: the incident occurred because there is a problem, because something is lacking. Ignoring it will only lead to a serious incident next time. Ignoring weak signals is a pervasive temptation you must learn to overcome.
    4. Rule # 4: It is essential not to subordinate the chance to avoid catastrophe to other considerations. Catching a plane does not mean you should drive fast on the road. Missing the plane is better than injuring yourself or a person on the road, and missing the plane anyway.
    5. Rule # 5: Do not delay by waiting for absolute proof or permission to act. The signal may not be true, and you may become a laughing stock if it turns out to be false. But don’t get disheartened. Acting is better than allowing damage or loss of lives if the warning sign turns out to be true.
(Intelligence agencies issue alerts many times to the government and citizens about terrorist attacks and the like, and many times we do not see attacks. This does not mean that we should not believe those alerts. It is not possible to fully understand the complex minds of people when we do not even know what we ourselves want; it is much more difficult still to understand nature. It is easy to blame safety and security officials for being overcautious, but you are the first to blame them when incidents occur, without realizing that you are responsible for your own safety. If you do not know what to do in your own house, or what is happening in your own backyard, who are you to question others?)
  60. Don’t squander your early warnings with delays or half measures. If you do, don’t be surprised if the clock runs out.
  61. Treat near misses as genuine accidents: It is a safety sine qua non that near misses and other forms of weak signals be treated as if they were genuine accidents. They are considered “free tuition” – valuable lessons without much cost. Always pay attention as if the worst had actually occurred, but develop efficient ways of confirming or disconfirming the actual danger to minimize your time and effort.
  62. In many accidents, the bulk of the damage occurs in the aftermath, not during the event. A tremendous amount of harm can be reduced by early warning systems, defensive construction, contingency planning, and rapid response. Even when the incident cannot be prevented, as is often the case in natural disasters like the tsunami, anticipation can mitigate much of the harm.
  63. Politics trumps safety. Here politics means one-upmanship and the resultant timelines, pressures, communication or lack of it, and so on.
  64. Routine and non-routine accidents: We do not see a hazard until we experience the consequence. Many accidents occur routinely because people are irrational about danger. People are scared of non-routine accidents like anthrax poisoning, nuclear accidents, and flu epidemics, but not of routine accidents like slips, falls, road accidents, and deaths from smoking or alcohol, which kill far more people. People overreact to rare risks and underreact to common accidents.
  65. In some cases, like living near the ocean or in a volcanic, seismic, or mountainous zone, we may feel that we have no choice but to accept risk; but flirting with disaster out of ignorance or denial, rather than rational choice, is simply foolish.
  66. Residential fires related to cooking: Home cooking is responsible for starting over a quarter of the 400,000 residential fires that cause 13,000 injuries and 3,000 deaths in the United States each year. Smoke alarms, fire blankets, and fire extinguishers, as well as safe practices for deep-fat frying and other high-risk activities, are sensible precautions even if they are not perfect solutions. (In the last few years, we have seen a number of fires and explosions due to ruptures of gas piping in residential areas. The common reasons are digging without authorization, valves not closed properly, corrosion, poor maintenance and monitoring, etc. Still, thousands of miles of gas lines are laid every year and we live with them.)
  67. The enemies of effectively dealing with low-probability risks are denial, ignorance, and lack of preparation. Denial prevents our dealing with the risks in the first place (not recognizing the hazard); ignorance constrains our choices and distorts our priorities; and lack of preparation forces us to deal with complex problems under emotional pressure and time constraints, vastly increasing the chances of bad judgment and the possibility that we will be overtaken by events. Examine the cumulative risk of all low-probability threats and make your plans according to the rule of avoiding the greater mistake. You may not always make the same choice for each risk, or the same choices as other people, but they will be your choices, made with knowledge and forethought.
  68. The consequence of even a minor risk can be high. A simple event will grow into a monster when we are not prepared.
  69. Moving from BYSTANDER to WITNESS to WHISTLE-BLOWER: You may not be able to question the defaulters at all times. Sometimes just “active watching”, visibly taking notes, or writing a concerned e-mail is enough to change the course of a situation. Being visible and questioning clearly inappropriate actions, rather than fading into the background, often makes a difference, even if it is not a decisive action. Equally important, when someone else takes stand-up action, lending visible support matters a great deal. Individual effort may not be effective, but team effort will make wrongdoers change their ways. Silently watching or cooperating with wrongdoers will lead to the destruction of society and of the individual too, while taking action and making the right noises will make a person confident and satisfied, and help society.
  70. Suggestions for Professionals and Managers:
    1. We should not be bystanders and should not encourage bystander behavior in those around us.
    2. We should all do what we can to ensure that dissent is encouraged, not repressed, and that the channels of complaint are open.
    3. We should do what we can to build viable information and reporting systems that widely disseminate risk-related performance information. According to research, when people’s actions go unrecorded, and are therefore undetectable, the chances of shortcuts under pressure rise by a factor of TEN.
    4. We should not collude in cover-ups, even minor ones. Such cover-ups may cause greater difficulty when it becomes necessary to reveal embarrassing facts later on. Incidents should not be covered up by labeling them as acceptable risk.
    5. When there is a likely and recordable unacknowledged risk, each of us should assemble our allies and pursue a complaint with the appropriate institutional body. If all else fails, we should consider blowing the whistle (with documents). Most of us are prisoners of institutional realities that tolerate unacceptable risk in the name of practicality. The fallacy in most organizations is that lowering risks is unacceptably expensive. In fact, not only is it probably much less expensive than people think, but over the long term it will probably save money as well as lives.
  71. Suggestions for Leaders:
    1. Realize that practicalities and shortcuts have costs that inevitably even out in time, and that one’s choice is to either pay now or pay later. Maybe your policies will not lead to accidents during your tenure, and you will get all the appreciation for the short-term gains, but the organization suffers later when those policies lead to catastrophes in the long run.
    2. We can’t put a price tag on injuries and deaths, and compensation alone is not sufficient to judge the cost.
    3. Leadership is often the originator of the financial, scheduling, or political pressures, and thus is the ultimate source of a significant increase in risk. Imposing nonnegotiable performance objectives combined with severe sanctions for failure encourages the violation of safety rules, reporting distortions, and dangerous shortcuts. Putting people in no-win performance situations encourages recklessness and fraud, inevitably increasing the chances of a major catastrophe. Leaders must therefore hold themselves accountable for the inadvertent consequences of their management philosophy and tactics.
    4. Pay scrupulous attention to design. When design is faulty, accidents happen. In organizational settings, accidents are never accidental: They are inevitably the result of faulty management, particularly the management of safety.
    5. Systemize paying attention to near misses, weak signals, and the assessments of engineers and safety officials. Leaders have to create monitoring systems, systematic review procedures, and independent information channels that do not report through the operational chain of command. While safety and risk management is perfectly compatible with efficient operations over the long term, it often runs contrary to them in the short term, especially if there have been long periods of neglect.
    6. Recognize that while every organization tolerates some dissent, on certain subjects it does not. Only leaders can eliminate these “undiscussables”. Encourage whistle-blowers so as to get timely information about risks; otherwise bystander behavior is inevitable and will affect the organization in the long run.
    7. Create effective contingency plans for serious but low-probability risks.
    8. Every organization requires robust, independent watchdogs. There is no substitute for regulatory independence, and it should not be measured in terms of the cost of maintaining it.
    9. Leadership must subject itself to relentless review and self-criticism.
  72. Relabeling problems as opportunities can produce a true shift in mental framework and reap benefits for the organization.
  73. The first big mental shift is accepting the inevitability of accidents and catastrophes without giving in to them. Do not wait until after a disaster strikes.
  74. The second big mental shift is appreciating the difference between new ideas and unpracticed old ones.
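
As a plausibility check on the 500 miles per hour figure in the Tilly Smith item above: in the deep ocean a tsunami behaves as a shallow-water wave, whose speed is v = sqrt(g x d), where g is gravitational acceleration and d the water depth. A short Python check, with 4,000 m assumed as a typical deep-ocean depth:

    # Shallow-water wave speed v = sqrt(g * d).
    # The 4,000 m depth is an assumed typical deep-ocean value.
    import math

    g = 9.81          # gravitational acceleration, m/s^2
    depth_m = 4000    # assumed deep-ocean depth
    v_ms = math.sqrt(g * depth_m)      # ~198 m/s
    v_mph = v_ms * 3600 / 1609.344     # ~443 miles per hour
    print(f"{v_ms:.0f} m/s = {v_mph:.0f} mph")

At 5,000 m depth the same formula gives about 495 miles per hour, so the book's figure is the right order of magnitude.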

Dec 15, 2008

Importance of Learning


Accidents and near misses are the result of mistakes committed by someone.

On human errors, there is a quotation from the great safety professional, Mr Trevor A. Kletz (retired from ICI, UK, and now about 90 years old):

“Mistakes occur because someone does not know what to do. To prevent them we need better training or instruction or changes to the plant design or work method so that the task is easier”.

The mistake can be due to a shop floor person, a supervisor, or an officer, during operation, while issuing instructions, or during design. A mistake can be part of the system right from the beginning, or it could have been committed subsequently.

We all make mistakes. However, we can become wise and knowledgeable only when we learn from these mistakes.

Experience makes a man wise. However, experience is not desirable in all respects; some lessons are too costly to learn first-hand.

Again, in the words of Mr Kletz in his book, “Still Going Wrong”:

“A high price was paid for the safety information mentioned in various sources (like books, films and the internet). People were killed or injured and billions of dollars’ worth of equipment was damaged. Someone has paid the TUITION FEES. There is no need for you to pay them again”.

It only takes one apple to spoil the whole barrel, as the old saying goes, and this is true whenever a group of humans gathers together. If one friend or family member is unhappy, that unhappiness can spread to infect the others in the group.

The same can be applied to safety. If a person is unaware of or unconcerned about safety, he will endanger others too.

If we look at the accidents that have occurred in any particular industry since its inception, or even just over the last few years, can we say that all of them were first-time occurrences? It is unlikely, and I am sure nobody would agree with such a claim. Most accidents are repetitive in nature, if not exactly, then in a slightly different manner. If we go through the accident and near-miss records, we can find similarities in the types of accidents; maybe not in the same plant, but in some other plant under the management of the same organization.
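
As an illustration of how such similarity can be spotted in the records, here is a small Python sketch that counts recurring accident types across plants; the records and categories are invented for the example.

    # Count recurring accident types across plants to spot repeats.
    # The records below are invented for illustration.
    from collections import Counter

    records = [
        ("Plant A", "hot work - flash fire"),
        ("Plant B", "fall from height"),
        ("Plant C", "hot work - flash fire"),
        ("Plant A", "fall from height"),
        ("Plant B", "hot work - flash fire"),
    ]

    by_type = Counter(acc_type for _, acc_type in records)
    for acc_type, count in by_type.most_common():
        if count > 1:
            print(f"{acc_type}: {count} occurrences across the organization")

Anything that repeats across plants is a candidate for an organization-wide corrective action rather than a local fix.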

Dissemination of information is vital so that all of us can study accidents and near misses and ask: IS IT POSSIBLE FOR SUCH AN ACCIDENT TO OCCUR IN MY PLANT? In the safety committee meetings held every month, the discussion should be about accidents and near misses that occurred in our plant or in other plants. Instead, the observation is that discussion mostly dwells on pending safety-related deficiencies (SRDs) which were already brought to the plant management's notice by the safety officer, and which can be corrected by allocating the necessary resources (a responsible person, manpower, finance, etc.).

Information on accidents, near misses, unusual occurrences, etc. is normally compiled at the corporate level and sent to all units of the organization after review; this is one valuable source of information on accidents. Similarly, we read in the newspapers about fires, explosions, electrocutions, falls of persons, etc., with some details, so the media is another source. Those who have access to other sources, such as books (a library) or the internet, can also gather valuable information.

Seminars can be conducted in which employees who were directly involved in accidents, or who witnessed them, explain their experience, with the noble aim of saving their colleagues from similar accidents. Discussing these in seminars, plant meetings, etc. creates awareness about safety and can ignite our minds to take the necessary corrective measures to maintain a safe workplace.
