We are nothing after our death. Let us donate our body organs for the needy.

Be not afraid of anything. You will do marvelous work. The moment you fear, you are nobody. - Swami Vivekananda

If you think safety is expensive, try an accident... - O.P. Kharbanda

Preventable accidents that are not prevented due to our negligence are nothing short of murder. - Dr. Sarvepalli Radhakrishnan, 2nd President of India

Zero accidents through zero unsafe behaviors. Do not be complacent because there are no accidents: there may be near-miss accidents (NMAs) in which, by luck or chance, somebody escaped, knowingly or unknowingly. We cannot be safe if we depend on luck.

Safety culture is how the organization behaves when no one is watching.

We make No compromise with respect to Morality, Ethics, or Safety. If a design or work practice is perceived to be unsafe, we do not proceed until the issue is resolved. - Mission statement by S&B Engineers & Consultants Ltd. http://www.sbec.com/safety/

Human meat gets the least priority - a doctor's comment on accidents

CSB video excerpts from Dr. Trevor Kletz: http://www.youtube.com/watch?v=XQn5fL62KL8

Nov 13, 2011

What is a safe speed on the road?

Over the last few days, I have felt lucky many times on the road while going to my workplace. Each time, I believe it was my low driving speed that saved me from injury: pedestrians crossing the road all of a sudden, persons asking for a lift by coming halfway onto the road, two-, three-, and four-wheelers suddenly cutting into the middle of the road from the left, people walking on the road while conversing on their mobiles, pushcarts taking their own time to move, and so on.
Many times I felt like shouting, but what use would that be, other than increasing my stress and spoiling my day?
From these observations, I found that the safe speed (not the average speed) at which I can stop myself from hitting others on the road is 30 km/h, and not more. At this speed, even if someone crosses my path suddenly or the vehicle in front stops abruptly, I can still apply the brakes and avoid an accident, even if the other fellow on the road seems determined to take a hit from me.
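
As a rough cross-check on that 30 km/h figure, here is a minimal stopping-distance sketch in Python. The 1.5 s reaction time and 6 m/s² braking deceleration are my illustrative assumptions for a dry road, not figures from any standard.

# Rough stopping-distance estimate: reaction distance + braking distance.
# Assumptions (illustrative only): 1.5 s driver reaction time and
# 6 m/s^2 braking deceleration on a dry road.

REACTION_TIME_S = 1.5
DECELERATION_MS2 = 6.0

def stopping_distance_m(speed_kmph: float) -> float:
    v = speed_kmph / 3.6                       # convert km/h to m/s
    reaction = v * REACTION_TIME_S             # distance covered before braking starts
    braking = v ** 2 / (2 * DECELERATION_MS2)  # distance covered while braking
    return reaction + braking

for speed in (30, 40, 50):
    print(f"{speed} km/h -> {stopping_distance_m(speed):.0f} m to stop")

With these assumptions, a vehicle at 30 km/h stops in about 18 m, while at 50 km/h it needs about 37 m, roughly double; a modest reduction in speed buys a disproportionate safety margin against sudden crossings.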

Person suffers burns during digging

A construction labourer suffered burn injuries while working in a trench to repair septic tank piping. While using a power tool to remove the damaged pipe, he hit an LPG line in the trench, and the sparks started a fire that engulfed him. The injured worker climbed out of the trench with difficulty and rolled on the grass to extinguish the flames. The person who engaged him was penalised for various lapses.

From the description of the incident, the lapses that come to notice are:
  1. Whether any permission was taken for digging where gas lines are present. It is a widely followed safety practice to identify the presence of utility lines (electrical/gas/water) in the area proposed for digging and to take precautions to ensure the safety of persons and property. In a factory area, the concerned engineer gives such clearance; in a public area, the town planning engineer does.
  2. In the photograph in the weblink, service line identification marks are not visible in the area.
  3. Absence of measures to prevent trench cave-in.
  4. Lack of proper means of access/egress from the trench, such as ladders at 30 m intervals.
  5. Whether only one person was engaged for such work, and if not, what happened to the other workers.
  6. Absence of supervision.
Most of the accidents reported of this kind are leaks, fires, or explosions due to puncturing of gas lines during digging.

Nov 10, 2011

Off-Site Emergency Response Plans: A Preparedness Tool - Fire Engineering

The Bhopal accident in 1984 led to the promulgation of an Act in 1986 to identify hazardous substances, notify their inventories, and prepare for emergencies through planning, notification, and reporting. The article states that for chlorine, if the quantity exceeds 100 pounds, one has to comply with this Act, and that there are about 350 chemicals on the list.

Robospiders to monitor chemical accidents

German scientists have developed robospiders to monitor conditions at an accident site: to see what is going on after an accident, check toxicity levels, and caution engineers so that lives can be saved and property damage minimized. The article states that the thin robospiders come to life and perform their job when they become thick. I am unable to understand the mechanism of their working, but the concept is new to me. They appear better than remote cameras and sensors, which work from fixed locations, whereas crawling robospiders can detect conditions at the desired place. Details of the communication between the robospider and the control room, the effectiveness of that communication, and the robots' safety in a hazardous environment are not given in the article.
http://au.ibtimes.com/articles/245698/20111109/robospider-created-monitor-chemical-accidents.htm

Nov 8, 2011

About safety leadership - Pull the String

The link below is about safety leadership and the qualities required of a safety professional.
Pull the String

Nov 4, 2011

Another acid-related incident

A few days ago, I posted an article about acid being used in the belief that it was water, causing acid burns. Yesterday, another incident of a similar nature was reported in the newspapers, wherein acid was applied to a pregnant lady in a hospital on the assumption that it was an antiseptic. It is reported that the baby boy delivered later died, though there are conflicting reports about the time of death.
Hospital authorities suspect that
The same newspaper report states that earlier, in other hospitals, (i) a nurse used acid instead of spirit for swabbing before administering an injection, and (ii) a patient was mistakenly given acid instead of water.

I have observed that sometimes incidents of a similar nature, such as children trapped in open borewells, attacks by jilted lovers on girls, and acid burns, are reported one after another within a short time. I do not know the reason for this pattern: whether such incidents occur regularly but get reported only during a certain period, or whether they actually occur only during certain periods.
http://timesofindia.indiatimes.com/city/kolkata-/Acid-used-for-delivery-at-state-hospital-baby-dies/articleshow/10600194.cms

Nov 2, 2011

Flirting With Disaster - some notes

http://www.flirtingwithdisaster.net/
FLIRTING WITH DISASTER
Why Accidents Are Rarely Accidental
Marc Gerstein with Michael Ellsberg

The book presents case studies, root causes, and the lessons to be drawn from them. The following extract from the book is useful in implementing safety at the workplace.
  1. Organizations do not learn routinely and systematically from past errors and disasters – in fact, they rarely ever do.
  2. A deliberate decision not to try to learn from accidents is an anti-learning mechanism. It arises because the blame and punishment/penalty that follow the identification of mistakes outweigh the benefits of understanding what should be done within the organization to avoid such mistakes.
  3. There is strong and successful resistance within many organizations to studying or recording past actions leading to catastrophe, because doing so would reveal errors, lies, or even crimes.
  4. Many accidents are not accidents at all: they were imagined and accurately predicted, but the alarms were ignored by those who had the power to disregard them. It is hard to grasp the scale of suffering such mistakes can create.
  5. Some saw the warning signals, but the warnings were not voiced in such a way, or to the relevant people, as to galvanize them into action. This phenomenon is called ‘bystander behavior’.
  6. Organizational bystanders are individuals who fail to take action even when important threats or opportunities arise. They often have crucial information or a valuable point of view that would improve an organization’s decision-making, but for a variety of psychological and institutional reasons, they do not intervene.
  7. Observers are not likely to act if “better-qualified” authorities or experts are present nearby.
  8. Bystander behavior is more likely to occur in organizations with strong hierarchies and rigid group boundaries that are populated with leaders lacking the ability to value, foster, and manage dissent. Such organizations are also more likely to be staffed by midlevel managers who lack the motivation or skill to elicit views that differ from those of their bosses. When those in the middle suspect that things are amiss, they tend to ignore their own concerns. Instead, they defer to others in authority, justifying their inaction by attributing greater knowledge and wisdom to their superiors.
  9. Short-term thinking about money is a factor in many accidents.
  10. Dangers arise when regulators and watchdog agencies develop financial and political ties to the entities they are supposed to be regulating and watching.
  11. Regulators are morally culpable when they do not take action.
  12. Collapses of firms were the result of the corrosive effects of envy, greed, and divided loyalties, combined with the deeper issue of organizational culture and its role in the fostering of disaster. The consequences are severe when watchdogs become consultants to the firms they watch.
  13. Many solutions to risk reduction involve going against prevailing beliefs and biases. When ignored, most risks do not somehow take care of themselves or simply cease to be an issue.
  14. Each uncorrected risk is one more accident waiting to happen.
  15. The truth will not come out in organizations that punish offenders. Accident investigation is a fact-finding mission, not a fault-finding mission.
  16. Many disasters, including natural disasters, are preventable. In all cases, the severity can be reduced by better planning, hard work, and a mind open to the nature of risk. The question is whether we have the wisdom and the will to change.
  17. Risk versus uncertainty: Risk is associated with something going badly wrong, whereas uncertainty involves outcomes of any kind.
  18. Unknown probabilities are riskier.
  19. People see greater risk when making decisions about things of which they feel comparatively ignorant (though sometimes it is the other way around): the more we know, the less respect we give.
  20. In the modern world, many of the unfortunate outcomes occur to other people, not to the decision-maker and his kin.
  21. Being RISK BLIND underlies most tragedies. Knowledge should be available to, and understood by, the decision-makers.
  22. Technology is not always well behaved. Innovators do not fully understand the behavior of the systems they create.
  23. Emerging technology has not had the time to accumulate a substantial body of experience through use under varied conditions (i.e., it is not fully tested before being put into practice).
  24. Engineering personnel might have a hunch about a particular risk but lack culturally acceptable proof that the risk is real. In such a situation, the organization can behave as if conditions are safe until the hunch is verified as a real risk through further testing or a real-life accident. Conversely, the organization can assume that conditions are risky until they can be proved safe.
  25. Wishful thinking: thinking in whatever way pleases us.
  26. We see what we expected to see, not what was actually there.
  27. Under pressure, people often see what they want to see, especially if they are pushing the company and their subordinates in a particular direction.
  28. Causes for an accident: i) cold causes: unintentional mistakes, although not unimportant; ii) warm causes: include ignoring weak signals of danger and other bureaucratic inefficiencies in response to indications of risk – these choices appear less innocent than many design errors because they involve decision-makers’ priorities and judgment in the face of explicit, identified risks; iii) hot causes: deliberately subordinating safety to financial and political pressure – unethical and immoral decisions – often consist of conscious decisions that may well expose people to harm without their knowledge, and certainly without their consent.
  29. Design errors are central to many accidents – not visible till tragedy strikes.
  30. Design weaknesses often fall into two categories: the obvious and the subtle.
  31. Faulty design creates latent unsafe conditions that can result in an accident under particular circumstances.
  32. Design issues are the responsibility of the management, not of the workers.
  33. People are tempted by short-term gains or coerced by social pressure, and their risky behavior is then strongly reinforced when they repeatedly get away without incident. People become comfortable with deviations that have not caused any accident and forget to be afraid.
  34. An inability to eliminate recurring warning signals indicates system failure.
  35. Eleventh-hour meetings are generally ineffective environments, in which unpopular theories with little evidence are not considered or given due weight.
  36. Ignoring weak signals is the norm in many organizations; it occurs in business and public policy as well as in science.
  37. It is easy to find the causes of an accident after it occurs, but one should find them before it occurs.
  38. Progress inevitably engenders risk.
  39. Improving safety also encourages risk taking.
  40. People rely on instrumented systems on the assumption that they function as per the design intention, but the actual behavior of these instruments can vary depending on their installation; when they do not behave as expected, the resulting difficulties are severe during emergencies.
  41. If responses take decades but hazards take far longer to develop, all is well. If the relationship is reversed, as was the case during floods, then things may end in disaster.
  42. In many organizations, decisions have to be approved by higher-ups, a process that inevitably slows things down.
  43. Redundancy is often the key to risk protection.
  44. Complex systems introduce unknown failure scenarios (KISS – keep it simple, stupid).
  45. Many accidents can be traced to various faults with monitoring and control systems, information overload on the operator, and inadequate training.
  46. The shift to software-intensive systems has made man-machine partnerships far more complex than we fully understand. Highly reliable technology makes people less vigilant, since human beings are not effective monitors of situations that rarely fail. Employing more comprehensive and reliable systems only exacerbates the problem. Although such systems are more reliable, they are more boring to monitor as well as more difficult to diagnose.
  47. Layers of protection include safety procedures; training programs; specialized hardware interlocks; monitors, alarms, and warnings; and various forms of containment systems.
  48. Catastrophes occur when defensive systems fail or are deliberately disabled.
  49. Butterfly effect: the idea that small differences can lead to major consequences down the road and at a distance is often called the butterfly effect (small becomes big; monsters look innocent) – see the small-differences sketch after this list.
  50. Energy conservation would not only reduce dependence on imported oil, but would also save consumers money and cut urban air pollution, acid rain, greenhouse gases, the production of radioactive wastes, trade deficits, and the long-term defense costs of protecting oil installations.
  51. Many important dynamics take a long time to have a visible effect.
  52. Facing the choice between the short-term requirements versus the long-term needs is not an easy decision.
  53. Understanding how an organization recognizes the hazards it faces, as well as how it changes in response to those hazards, is essential to avoiding disaster.
  54. Culture consists of emergent organizational properties that cannot be separated from history, especially the actions taken by company leaders.
  55. Basic cultural assumptions are deep-level tenets that employees and members of organizations hold to be true, often without realizing it. Over time, decisions that may start out as opinions, personal preferences, or practical necessities evolve into internalized truths that become second nature throughout the organization. Organizational members who “think the unthinkable” find themselves fighting a war on two fronts: the need to prove their case, and the need to establish the legitimacy of the arguments on which their case is based.
  56. Easter Island: Easter Island, the most remote inhabited place on earth, located in the South Pacific Ocean, was not ideal for new inhabitants because of its conditions, yet it contains giant stone statues. Read in the book the story of how cultural change brought self-destruction (http://flirtingwithdisaster.net/easter-island_321.html).
  57. Organizational tunnel vision: people within organizations who are obsessed with maximizing a single metric are especially prone to being blind to other considerations. In order to keep a schedule, engineers with a safety concern have to prove that their concern is valid and that the scheduled activity is unsafe, rather than the burden being on others to prove that it is safe (program engineers may ask the safety person, ‘show me how it is unsafe’, instead of analyzing the concern themselves and proving to the safety engineer that it is safe).
  58. Tsunami, December 2004: A schoolgirl, Tilly Smith, on vacation at Maikhao Beach, Thailand, noticed frothing and rapid receding of the ocean waters and alerted her mother, as her teacher had described such phenomena as signs of an impending tsunami. Her action saved the lives of everyone on the beach. (Tsunami waves can travel at 500 miles per hour across the deep ocean; see the wave-speed sketch after this list.)
  59. Rules for preventing and coping with accidents:
    1. Rule #1: Understand the risks you face. Evaluate the hazards every time you face them. Probabilities don’t matter once an event with serious consequences, like a tsunami, occurs. Whatever the probability, in the words of Trevor Kletz, “we have done it this way 100 times” is not acceptable unless an accident on the 101st time is acceptable. Take action assuming the probability is 100% all the time.
    2. Rule #2: Avoid being in denial. Do not neglect warning signs or dismiss them as silly.
    3. Rule #3: Pay attention to weak signals and early warnings; they telegraph possible danger. Accidents don’t just happen, and they are often not accidental at all. Do not treat an incident as a one-time affair: it occurred because there is a problem, because something is lacking. Ignoring it will only lead to a more serious incident next time. Ignoring weak signals is a pervasive temptation you must learn to overcome.
    4. Rule #4: It is essential not to subordinate the chance to avoid catastrophe to other considerations. Catching a plane does not mean you should drive fast on the road. Missing the plane is better than injuring yourself or another person on the road and missing the plane anyway.
    5. Rule #5: Do not delay by waiting for absolute proof or for permission to act. You may become a laughing stock if the signal does not turn out to be true, but don’t get disheartened: acting is better than allowing damage or loss of lives if the warning sign does turn out to be true.
(Intelligence agencies issue alerts to the government and citizens many times about terrorist attacks and the like, and many times no attack follows. This does not mean that we should not believe those alerts. It is not possible to understand the complex minds of people when we do not even know what we ourselves want; it is far more difficult still to understand nature. It is easy to blame safety and security officials for being overcautious, yet you are the first to blame them when incidents do occur, without realizing that you are responsible for your own safety. If you do not know what to do in your own house, or what is happening in your own backyard, who are you to question others?)
  60. Don’t squander your early warnings with delays or half measures. If you do, don’t be surprised if the clock runs out.
  61. Treat near misses as genuine accidents: it is a safety sine qua non that near misses and other forms of weak signals be treated as if they were genuine accidents. They are considered “free tuition” – valuable lessons without much cost. Always pay attention as if the worst had actually occurred, but develop efficient ways of confirming or disconfirming the actual danger to minimize your time and effort.
  62. In many accidents, the bulk of the damage occurs in the aftermath, not during the event. A tremendous amount of harm can be reduced by early warning systems, defensive construction, contingency planning, and rapid response. Even when the incident cannot be prevented, as is often the case in natural disasters like the tsunami, anticipation can mitigate a lot of harm.
  63. Politics trumps safety. Here, politics means one-upmanship and the resulting timelines, pressures, communication or lack of it, and so on.
  64. Routine and non-routine accidents: we do not see a hazard until we experience the consequence. Many accidents occur routinely because people are irrational about danger. People are scared of non-routine accidents like anthrax poisoning, nuclear accidents, and flu epidemics, but not of routine accidents like slips, falls, road accidents, and deaths from smoking or alcohol consumption, which kill far more people than non-routine accidents. People overreact to rare risks and underreact to common ones.
  65. In some cases, such as living near the ocean or in a volcanic or seismic zone or the mountains, we may feel that we have no choice but to accept the risk, but flirting with disaster out of ignorance or denial rather than rational choice is simply foolish.
  66. Residential fires related to cooking: home cooking is responsible for starting over a quarter of the 400,000 residential fires that cause 13,000 injuries and 3,000 deaths in the United States each year. Smoke alarms, fire blankets, and fire extinguishers, as well as safe practices for deep-fat frying and other high-risk activities, are sensible precautions even if they are not perfect solutions. (In the last few years, we have seen a number of fires and explosions due to rupture of gas piping in residential areas. The common causes are digging without authorization, valves not closed properly, corrosion, and poor maintenance and monitoring. Still, thousands of miles of gas lines are laid every year, and we live with them.)
  67. The enemies of effectively dealing with low-probability risks are denial, ignorance, and lack of preparation. Denial prevents our dealing with the risks in the first place (not recognizing the hazard); ignorance constrains our choices and distorts our priorities; and lack of preparation forces us to deal with complex problems under emotional pressure and time constraints, vastly increasing the chances of bad judgment and the possibility that we will be overtaken by events. Examine the cumulative risk of all low-probability threats and make your plans according to the rule of avoiding the greater mistake. You may not always make the same choice for each risk, or the same choices as other people, but they will be your choices, made with knowledge and forethought.
  68. The consequences of even minor risks can be high: a simple event will grow into a monster when we are not prepared.
  69. Moving from BYSTANDER to WITNESS to WHISTLE-BLOWER: you may not be able to question the defaulters at all times. Sometimes just “active watching”, visibly taking notes, or writing a concerned e-mail is enough to change the course of a situation. Being visible and questioning clearly inappropriate actions, rather than fading into the background, often makes a difference, even if it is not a decisive action. Equally important, when someone else takes stand-up action, lending visible support matters a great deal. An individual effort may not be effective, but a team effort will make wrongdoers change their ways. Silently watching or cooperating with wrongdoers will lead to the destruction of society and of the individual as well, while taking action and making the right noises will make the person confident and satisfied and will help society.
  70. Suggestions for Professionals and Managers:
    1. We should not be bystanders and should not encourage bystander behavior in those around us.
    2. We should all do what we can to ensure that dissent is encouraged, not repressed, and that the channels of complaint are open.
    3. We should do what we can to build viable information and reporting systems that widely disseminate risk-related performance information. According to research, when people’s actions go unrecorded, and are therefore undetectable, the chances of shortcuts under pressure rise by a factor of TEN.
    4. We should not collude in cover-ups, even minor ones. Such cover-ups may lead to increased difficulty when it becomes necessary to reveal embarrassing facts later on. Not every incident should be passed off as an acceptable risk.
    5. When there is a likely and recordable unacknowledged risk, each of us should assemble our allies and pursue a complaint with the appropriate institutional body. If all else fails, we should consider blowing the whistle (with documents). Most of us are prisoners of institutional realities that tolerate unacceptable risk in the name of practicality. The fallacy in most organizations is that lowering risks is unacceptably expensive. In fact, not only is it probably much less expensive than people think, over the long term it will probably save money as well as lives.
  71. Suggestions for Leaders:
    1. Realize that practicalities and shortcuts have costs that inevitably even out in time, and that one’s choice is either to pay now or to pay later. Your policies may not lead to accidents during your tenure, and you may get all the appreciation for the short-term gains, but the organization suffers later when those policies lead to catastrophes in the long run.
    2. We can’t put a price tag on injuries and deaths, and compensation alone is not sufficient to judge the cost.
    3. Leadership is often the originator of the financial, scheduling, or political pressures, and thus is the ultimate source of a significant increase in risk. Imposing nonnegotiable performance objectives combined with severe sanctions for failure encourages the violation of safety rules, reporting distortions, and dangerous shortcuts. Putting people in no-win performance situations encourages recklessness and fraud, inevitably increasing the chances of a major catastrophe. Leaders must therefore hold themselves accountable for the inadvertent consequences of their management philosophy and tactics.
    4. Pay scrupulous attention to design. When design is faulty, accidents happen. In organizational settings, accidents are never accidental: They are inevitably the result of faulty management, particularly the management of safety.
    5. Systemize paying attention to near misses, weak signals, and the assessments of engineers and safety officials. Leaders have to create monitoring systems, systematic review procedures, and independent information channels that do not report through the operational chain of command. While safety and risk management is perfectly compatible with efficient operations over the long term, it often runs contrary to them in the short term, especially if there have been long periods of neglect.
    6. Recognize that while every organization tolerates some dissent, on certain subjects it does not. Only leaders can eliminate these “undiscussables”. Encourage whistle-blowers so as to get timely information about risks; otherwise bystander behavior is inevitable and will affect the organization in the long run.
    7. Create effective contingency plans for serious but low-probability risks.
    8. Every organization requires robust, independent watchdogs. There is no substitute for regulatory independence, and it should not be measured in terms of the cost of maintaining it.
    9. Leadership must subject itself to relentless review and self-criticism.
  72. Relabeling problems as opportunities can produce a true shift in mental framework and reap benefits for the organization.
  73. The first big mental shift is accepting the inevitability of accidents and catastrophes without giving in to them. Do not wait until after a disaster strikes.
  74. The second big mental shift is appreciating the difference between new ideas and unpracticed old ones.
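
Regarding point 49, here is a minimal sketch of the butterfly effect using the logistic map, a standard textbook example of sensitive dependence on initial conditions (my choice of illustration, not from the book): two starting values that differ only in the sixth decimal place end up bearing no resemblance to each other within a few dozen steps.

# Butterfly effect demo: the logistic map x -> r*x*(1 - x) in its chaotic
# regime (r = 4). Two trajectories that start almost identically diverge.

def logistic_trajectory(x0: float, steps: int, r: float = 4.0) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000, 50)
b = logistic_trajectory(0.300001, 50)  # differs only in the sixth decimal place

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")

By around step 20 to 30, the two trajectories are completely different even though the initial difference was invisible to the eye: small really is big.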
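
The 500 miles per hour figure in point 58 can be cross-checked with the standard shallow-water wave-speed formula v = sqrt(g × d). A quick sketch, assuming a deep-ocean depth of about 4,000 m (my illustrative number, not from the book):

# Tsunami speed from the shallow-water wave approximation: v = sqrt(g * d).
# The 4,000 m ocean depth is an assumed, illustrative value.

import math

G_MS2 = 9.81        # gravitational acceleration, m/s^2
DEPTH_M = 4000.0    # assumed deep-ocean depth, m

v_ms = math.sqrt(G_MS2 * DEPTH_M)  # wave speed in m/s
print(f"{v_ms:.0f} m/s = {v_ms * 3.6:.0f} km/h = {v_ms * 2.23694:.0f} mph")

This gives roughly 440 mph at 4,000 m depth, and closer to 500 mph over 5,000 m depths, so the book's figure is of the right order.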
