We are nothing after our death. Let us donate our body organs to the poor.

Be not afraid of anything. You will do marvelous work. The moment you fear, you are nobody. - Swami Vivekananda

If you think safety is expensive, try an accident... - O.P.Kharbanda

Preventable accidents, if not prevented due to our negligence, are nothing short of murder. - Dr. Sarvepalli Radhakrishnan, 2nd President of India

Zero accidents through zero unsafe behaviors. Do not be complacent just because there have been no accidents. There may be near-miss accidents (NMAs): by luck or chance, somebody escaped, knowingly or unknowingly. But we cannot be safe if we depend upon luck.

Safety culture is how the organization behaves when no one is watching.

We make no compromise with respect to Morality, Ethics, or Safety. If a design or work practice is perceived to be unsafe, we do not proceed until the issue is resolved. - Mission statement of S&B Engineers & Consultants Ltd. http://www.sbec.com/safety/

Human meat gets the least priority - a doctor's comment on accidents

CSB video excerpts from Dr. Trevor Kletz: http://www.youtube.com/watch?v=XQn5fL62KL8


Dec 30, 2018

Some thoughts on the fall of safety standards - forgetfulness, over-confidence, complacency, arrogance

Immediately after an unwanted event, any person or organization will make several commitments to itself: I/we will do this or that from now onwards.
Even a student makes such resolutions during or after an examination in which he did not do well.
But in most cases, nothing gets done.
Some will start, but the steam is soon lost.
Very few sustain the effort to a great extent, and even such persons/organizations make exemptions here and there.

This week, I attended a meet organized by a regulator. The opening remarks stated that regulators have to listen, observe and make informed decisions. This applies to all those whose job is monitoring/advising; they need not necessarily be the designated regulators.

As individuals too, we are our own regulators and doers. As the proverb goes, we have one mouth, two ears and two eyes, indicating that we should see and hear more, but talk less.

Coming back to the title, I have observed that even in entities that have developed the highest standards, become role models and are even in the business of training others, events have taken place for simple reasons that could have been avoided or addressed completely before their occurrence. Events in these places make me wonder whether the lessons learnt are wrong, i.e. whether the lessons identified for implementation are impracticable.

In many post-accident recommendations, training and supervision aspects are listed routinely. Even if the training is good, absorption of the knowledge and its use in day-to-day affairs is highly doubtful. Even those who understood the information do not implement it, despite knowing that this may lead to incidents.

In the financial sector, as I read somewhere, risk and returns are proportional: low-risk investments yield lower returns, while high-risk investments promise higher returns (with a correspondingly higher chance of loss). I had my own experience in share-market investment at two different periods, and on both occasions I had the opportunity (!) to learn the lesson by coming out with some loss.

I think it is human behavior to take risks in our own activities, be it on the roads or at the workplace. And it is the other way round when it comes to our home.

I commute by bike to my workplace, which is 13 km from my house; these thoughts do occur to me on the way, but I am yet to find an answer.

In the nuclear field, the ALARA (as low as reasonably achievable) concept is preached and generally followed to limit radiation doses to occupational workers and the public, whereas in industrial safety it is ALARP (as low as reasonably practicable), implying that we have to be practicable in taking measures to ensure the safety and health of persons. This also implies (to me) that a certain value is attached to human lives for dispensation.

If an accident takes place leading to injuries/fatalities, we generally read a declaration of compensation. Though an enquiry takes place, there are hardly any cases where the wrongdoers, i.e. those responsible for the occurrence of the accident, are punished. Every year, lakhs of persons die in accidents (on the roads / at the workplace), yet an equal or similar number of persons responsible for them go unpunished; it shows that a human life gets only a number, and things proceed as usual as if nothing had happened. The reasons for recurrence are generally those in the title of this post. This makes the job of safety advisers/regulators a difficult one.

If some disturbance takes place in an inhabited area, the police will be blamed for not doing their job. If emissions occur at a factory, the pollution control authorities will be blamed for sleeping or for having accepted something. If an accident takes place, the safety adviser will be blamed internally and the factory inspectors externally. And so on.
But little or no action is taken to identify the wrongdoers and punish them.

Authorities controlling such activities have the difficult task of facing pressures from all around and, at the same time, not succumbing to them. They have to be like stones, without emotions, so that their health and family are not affected by workplace pressures. Some will resign the post to look for a different type of job, or bid goodbye forever (if they are financially resourceful enough to lead the rest of their life).

Any management would like to have the highest productivity at minimal cost. Availability of qualified human resources is an issue faced by many. Most educational institutions have become factories churning out students with good grades/marks, but many of them fail miserably in getting employment commensurate with their certificates because they cannot answer the interviewers' questions. Hence they settle for whatever jobs they can get. They may be satisfied or may not; this can lead to dissatisfaction and doing the job for the sake of earning money only. It requires a lot of effort from management too, to train such people, and I am not sure how many organizations have a policy of training their manpower before engaging them, let alone retraining them at regular intervals or after a change of job or modification of the workplace.

With these things, maybe there is some optimal safe production, i.e. under the present set of conditions the plant can deliver X units of its product. If we try to stretch production higher without commensurate additions/improvements, accidents/failures do take place, if not immediately then definitely at a later date, because even established systems take some time to degrade. Now, if there is a change of management, the person at the top can claim success at higher production in the beginning (higher production is initially possible because degradation also needs some time), but the person coming later cannot deliver the same, because the degradation process starts accelerating.

Hence, those with safety in mind have to decide what their optimum safe production capacity is and stick to it. Else, what we keep seeing or reading in the papers will continue to happen.





Aug 13, 2015

Accidents, injuries, loss of property, some causes

As per a report, many accidents are happening in China due to poor safety conditions, lack of implementation of regulations and inspections, lack of training, etc.
If enforcement is not strengthened to match the needs of a growing economy, it becomes very difficult for regulators to monitor the activities of lakhs of industrial units. The situation may be such that it becomes difficult to inspect a unit even once in its lifetime, because of the huge number of factories and the dismal number of inspectors.


Dec 28, 2014

Safety - who is responsible - Failures attributed to regulators

When there is a robbery, the police are blamed.
When there is violence, the police are blamed.
When there is a train accident, the railways are blamed.
When a bank shuts shop, the government and police are blamed.
When some company offers quick returns and later closes down, again as above.
When there is an aeroplane accident, the airline and regulators are blamed.
When there is pollution, the factory and the pollution control board are blamed.
When there is an accident, the factory and the factories department are blamed.

Like this, the blame game goes on.

In many of the above events, I do not understand what the police or regulators can do. With such meager manpower, no regulator can prevent every situation.

If we are not careful about ourselves and our property, what can the police do? They will make rounds, but they cannot be present everywhere, every time. Similarly, if someone starts violence and the locals do not catch him, why blame others?

No doubt the police are there to prevent and control situations, but with limited manpower it is not possible for them to ensure 100% peace without cooperation from the people.

Similarly, we invest money out of greed for unrealistic returns and later cry about others.
If there is pollution, people should complain to the police and regulators about the problem. They can also form groups and pressure the factory management to correct the problem or close.

It is the responsibility of management to ensure safe working conditions and a safe environment for workers and the public. If something goes wrong, there should be fast-track courts to settle the cases within a day for quick redressal of the issues. These courts, depending upon the nature and extent of the violation, can order the company to pay up or to shut shop. There is no need to give time for correction, as safety and the environment are its basic duty.

Managements compromise on many issues for profit. Many promise and agree to comply with all legislation, only to violate several rules and regulations later. Such companies need not be shown any mercy for lack of knowledge or unintentional violations, as it is their responsibility to know everything before starting operations.



Jun 4, 2014

Bullying the safety officer is like hitting under your own pants

Those who do not like safety officers inspecting their areas to identify hazardous conditions, and who pick arguments to scare them away from visiting their work areas, do not help improve the situation. This only lets hazardous conditions spiral and can lead to accidents on a large scale at a later date.
Clearly, there are provisions in the related regulations to penalize and punish works managers for violations. It is a matter of time before the law catches up with the violators.

Unless these works managers realize the importance of safety and learn of the punishments that await (willful) violators, there is little hope for the safety of persons working in areas managed by them; such managers are no different from murderers.

Injuries/deaths to employees in those work areas can be compared to planned murders or attempts to cause injury, as the works managers neither take action to prevent unsafe conditions nor help the safety professionals to identify hazards and suggest safety measures.

Unfortunately, safety officers are not vested with powers like those of regulators and thus will be at the receiving end unless the in-charge of the facility gives unconditional support. Otherwise, at some time or other, the safety department officials will either keep quiet, leaving matters to fate, or look for better opportunities in other companies. Either way, it is a loss to the company with such a safety culture.

Nov 2, 2011

Flirting With Disaster - some notes

http://www.flirtingwithdisaster.net/
FLIRTING WITH DISASTER
Why accidents are rarely accidental
Marc Gerstein with Michael Ellsberg

The book presents case studies, root causes and lessons to be drawn. The following is an extract from the book, useful in implementing safety at the workplace.
  1. Organizations do not learn routinely and systematically from past errors and disasters – in fact, they rarely ever do.
  2. A deliberate decision not to try to learn from accidents is an anti-learning mechanism. The blame and punishment/penalty one gets after mistakes are identified outweigh the benefits of understanding what should be done within the organization to avoid such mistakes.
  3. There is strong and successful resistance within many organizations to studying or recording past actions leading to catastrophe, because doing so would reveal errors, lies, or even crimes.
  4. Many accidents are not accidents at all. They were imagined and accurately predicted. But, the alarms were ignored by those who had the power to disregard them. It is hard to grasp the scale of suffering such mistakes can create.
  5. Some saw the warning signals, but they were not voiced in such a way, or to the relevant people, as to galvanize them into action. This phenomenon is called ‘bystander behavior’.
  6. Organizational bystanders are individuals who fail to take action even when important threats or opportunities arise. They often have crucial information or a valuable point of view that would improve an organization’s decision-making, but for a variety of psychological and institutional reasons, they do not intervene.
  7. Observers are not likely to act if “better-qualified” authorities or experts are present nearby.
  8. Bystander behavior is more likely to occur in organizations with strong hierarchies and rigid group boundaries that are populated with leaders lacking the ability to value, foster, and manage dissent. Such organizations are also more likely to be staffed by midlevel managers who lack the motivation or skill to elicit views that differ from those of their bosses. When those in the middle suspect that things are amiss, they tend to ignore their own concerns. Instead, they defer to others in authority, justifying their inaction by attributing greater knowledge and wisdom to their superiors.
  9. Short term thinking about money is a factor in many accidents.
  10. Dangers arise when regulators and watchdog agencies develop financial and political ties to the entities they are supposed to be regulating and watching.
  11. Regulators are morally culpable when they do not take action.
  12. Collapse of firms were result of the corrosive effects of envy, greed and divided loyalties, combined with the deeper issue of organizational culture and its role in the fostering of disaster. The consequences are severe when watchdogs become consultants to the firms.
  13. Many solutions to risk reduction involve going against the beliefs and biases. When ignored, most risks do not somehow take care of themselves, or simply cease to be an issue.
  14. Each uncorrected risk is one more accident waiting to happen.
  15. Truth will not come out in organizations which punish the offenders. Accident investigation is a fact finding mission not a fault finding mission.
  16. Many of the disasters including natural disasters are preventable. In all cases, the severity can be reduced by better planning; hard work and a mind open to the nature of risk. The question is whether we have the wisdom and the will to change.
  17. Risk versus uncertainty: Risk is associated with something going badly wrong, whereas uncertainty involves outcomes of any kind.
  18. Unknown probabilities are riskier.
  19. People see greater risk when making decisions about matters of which they feel comparatively ignorant (though sometimes it is the other way around). The more we know, the less respect we give.
  20. In the modern world, many of the unfortunate outcomes occur to other people, not to the decision-maker and his kin.
  21. Being RISK BLIND underlies most tragedies. Knowledge should be available to, and understood by, the decision-makers.
  22. Technology is not always well behaved. Innovators do not fully understand the behavior of the systems they create.
  23. Emerging technology has not had the time to accumulate a substantial body of experience through use under varied conditions (i.e. not fully tested before put into practice).
  24. Engineering personnel might have a hunch about a particular risk but lack culturally acceptable proof that the risk is real. In such a situation, the organization can behave as if conditions are safe until the hunch can be verified as a real risk through further testing or a real-life accident. Conversely, the organization can assume that conditions are risky until it can be proved safe.
  25. Wishful thinking: Thinking the way it pleases us.
  26. We see what we expected to see, not what was actually there.
  27. Under pressure, people often see what they want to see, especially if they are pushing the company and subordinates in a particular direction.
  28. Causes for an accident: i) cold causes: unintentional mistakes, although not unimportant; ii) warm causes: include ignoring weak signals of danger and other bureaucratic inefficiencies in response to indications of risk – these choices appear less innocent than many design errors because they involve decision-makers’ priorities and judgment in the face of explicit, identified risks; iii) hot causes: deliberately subordinating safety to financial and political pressure – unethical and immoral decisions – often consist of conscious decisions that may well expose people to harm without their knowledge, and certainly without their consent.
  29. Design errors are central to many accidents – not visible till tragedy strikes.
  30. Design weaknesses often fall into two categories: the obvious and the subtle.
  31. Faulty design creates latent unsafe conditions that can result in an accident under particular circumstances.
  32. Design issues are the responsibility of the management, not of the workers.
  33. People are tempted by short-term gains or coerced by social pressure, and then their risky behavior is strongly reinforced when they repeatedly get away without incident. People develop comfort with deviations that did not cause any accident/wrong behavior and forget to be afraid.
  34. Inability to eliminate recurring warning signals shows system failure.
  35. Eleventh hour meetings are generally ineffective environments in which unpopular theories with little evidence are not considered/given due weightage.
  36. Ignoring weak signals is the norm in many organizations; it occurs in business and public policy as well as in science.
  37. It is easy to find causes for an accident after it occurs, but one should find them before the accident occurs.
  38. Progress inevitably engenders risk.
  39. Improving safety also encourages risk taking.
  40. People rely on instrumented systems, assuming they function as per design intent, but the actual behavior of these instruments can vary depending upon their installation; when they do not behave as expected, difficulties arise that are most severe during emergencies.
  41. If responses take decades but hazards take far longer to develop, all is well. If the relationship is reversed - as was the case during floods - then things may end in disaster.
  42. In many organizations, decisions have to be approved by higher-ups, a process that inevitably slows things down.
  43. Redundancy is often the key to risk protection.
  44. Complex systems introduce unknown failure scenarios (KISS - keep it simple, stupid).
  45. Many accidents can be traced to various faults with monitoring and control systems, information overload to the operator, inadequate training.
  46. The shift to software-intensive systems has made man-machine partnerships far more complex than we fully understand. Highly reliable technology makes people less vigilant, since human beings are not effective monitors of situations that rarely fail. Employing more comprehensive and reliable systems only exacerbates the problem. Although such systems are more reliable, they are more boring to monitor as well as more difficult to diagnose.
  47. Layers of protection include safety procedures; training programs; specialized hardware interlocks; monitors, alarms, and warnings; and various forms of containment systems.
  48. Catastrophes occur when defensive systems fail or are deliberately disabled.
  49. Butterfly effect: The idea that small differences can lead to major consequences down the road and at a distance is often called the butterfly effect (small is big, monsters looking innocent).
  50. Energy conservation would not only reduce dependence on imported oil, but would also save consumers money and cut urban air pollution, acid rain, greenhouse gases, the production of radioactive wastes, trade deficits, and the long-term defense costs of protecting oil installations.
  51. Many important dynamics take a long time to have a visible effect.
  52. Facing the choice between the short-term requirements versus the long-term needs is not an easy decision.
  53. Understanding how an organization recognizes the hazards it faces, as well as how it changes in response to those hazards, is essential to avoiding disaster.
  54. Culture consists of emergent organizational properties that cannot be separated from history, especially the actions taken by company leaders.
  55. Basic cultural assumptions are deep-level tenets that employees and members of organizations hold to be true, often without realizing it. Over time, decisions that may start out as opinions, personal preferences, or practical necessities evolve into internalized truths that become second nature throughout the organization. Organizational members who “think the unthinkable” find themselves fighting a war on two fronts: the need to prove their case, and the need to establish the legitimacy of the arguments on which their case is based.
  56. Easter Island: Easter Island, the most remote inhabited place on earth, located in the South Pacific Ocean, was not ideal for new inhabitants because of its conditions, yet it contains giant stone statues. Read the story of how cultural change brought self-destruction in the book (http://flirtingwithdisaster.net/easter-island_321.html).
  57. Organizational tunnel vision: People within organizations obsessed with maximizing a single metric are especially prone to being blind to other considerations. In order to keep a schedule, engineers with safety concerns have to prove that their concern is valid and that the scheduled activity is unsafe, rather than others having to prove that it is safe (program engineers may ask the safety person, ‘show me how it is unsafe’, instead of analyzing the concern themselves and proving to the safety engineer that it is safe).
  58. Tsunami, December 2004: A schoolgirl, Tilly Smith, on vacation at Maikhao Beach, Thailand, noticed the frothing and rapid receding of the ocean waters and alerted her mother, as her teacher had described such phenomena as signs of an impending tsunami. Her action saved the lives of everyone on the beach. (Tsunami waves can travel at 500 miles per hour across the deep ocean.)
  59. Rules for preventing and coping with accidents:
    1. Rule #1: Understand the risks you face. Evaluate the hazards every time you face them. Probabilities don’t matter once an event with serious consequences, like a tsunami, occurs. Whatever the probability, in the words of Trevor Kletz, “we have done it this way 100 times” is not acceptable unless an accident on the 101st time is acceptable. Take action assuming the probability is 100% all the time.
    2. Rule #2: Avoid being in denial. Do not neglect warning signs or dismiss them as silly.
    3. Rule #3: Pay attention to weak signals and early warnings. These telegraph a warning of possible danger. Accidents don’t just happen and are often not accidental at all. Do not treat an incident as a one-time affair; it occurred because there is a problem, something is lacking. Ignoring it will only lead to a serious incident next time. Ignoring weak signals is a pervasive temptation you must learn to overcome.
    4. Rule #4: It is essential not to subordinate the chance to avoid catastrophe to other considerations. Catching a plane does not mean you should drive fast on the road. Missing the plane is better than injuring yourself or a person on the road and missing the plane anyway.
    5. Rule #5: Do not delay by waiting for absolute proof or permission to act. The signal may not be real, and you may become a laughing stock if it doesn’t turn out to be true, but don’t get disheartened; acting is better than allowing damage or loss of lives if the warning sign does turn out to be true.
(Intelligence agencies issue alerts many times to the government and citizens about terrorist attacks or threats of a similar nature, and many times we do not see attacks. This does not mean that we should not believe those alerts. It is not possible to fully understand the complex minds of people when we do not even know what we ourselves want; it is much more difficult still to understand nature. It is easy to blame safety and security officials for being overcautious, but you are the first to blame them when incidents occur, without realizing that you are responsible for your own safety. If you do not know what to do in your own house, or do not know what is happening in your own backyard, who are you to question others?)
  1. Don’t squander your early warnings with delays or half measures. If you do, don’t be surprised if the clock runs out.
  61. Treat near misses as genuine accidents: It is a safety sine qua non that near misses and other forms of weak signals be treated as if they were genuine accidents. They are considered “free tuition” – valuable lessons without much cost. Always pay attention as if the worst had actually occurred, but develop efficient ways of confirming or disconfirming the actual danger to minimize your time and effort.
  62. In many accidents, the bulk of the damage occurs in the aftermath, not during the event. A tremendous amount of harm can be reduced by early warning systems, defense construction, contingency planning, and rapid response. Even when the incident can’t be prevented, as is often the case in natural disasters like the tsunami, anticipation can often mitigate a lot of harm.
  63. Politics trumps safety. Here politics means one-upmanship and resultant timelines, pressures, communication or lack of it, and so on.
  64. Routine and non-routine accidents: We do not see a hazard until we experience the consequence. Many accidents occur routinely because people are irrational about danger. People are scared of non-routine accidents like anthrax poisoning, nuclear accidents and flu epidemics, but not of routine accidents like slips, falls, road accidents and deaths from smoking/alcohol consumption, which kill far more people than non-routine accidents do. People overreact to rare risks compared with common accidents.
  65. In some cases, like living near the ocean or in a volcanic/seismic zone or in the mountains, we may feel that we have no choice but to accept risk, but flirting with disaster out of ignorance or denial rather than rational choice is simply foolish.
  66. Residential fires related to cooking: Home cooking is responsible for starting over a quarter of the 400,000 residential fires that cause 13,000 injuries and 3,000 deaths in the United States each year. Smoke alarms, fire blankets, and fire extinguishers, as well as safe practices for deep-fat frying and other high-risk activities, are sensible precautions even if they are not perfect solutions. (In the last few years, we have seen a number of fires and explosions due to rupture of gas piping in residential areas. The common reasons are digging without authorization, not closing the valve properly, corrosion, poor maintenance and monitoring, etc. Still, thousands of miles of gas lines are being laid every year and we are living with them.)
  67. The enemies of effectively dealing with low-probability risks are denial, ignorance, and lack of preparation. Denial prevents our dealing with the risks in the first place (not recognizing the hazard); ignorance constrains our choices and distorts our priorities; and lack of preparation forces us to deal with complex problems under emotional pressure and time constraints, vastly increasing the chances of bad judgment and the possibility that we will be overtaken by events. Examine the cumulative risk of all low-probability threats and make your plans according to the rule of avoiding the greater mistake. You may not always make the same choice for each risk, or the same choices as other people, but they will be your choices, made with knowledge and forethought.
  68. The consequences of even minor risks can be high. A simple event will grow into a monster when we are not prepared.
  69. Moving from BYSTANDER to WITNESS to WHISTLE-BLOWER: You may not be able to question the defaulters at all times. Sometimes just “active watching”, visibly taking notes, or writing a concerned e-mail is enough to change the course of a situation. Being visible and questioning clearly inappropriate actions, rather than fading into the background, often makes a difference, even if it is not a decisive action. Equally important, when someone else takes stand-up action, lending visible support matters a great deal. Individual effort may not be effective, but team effort will make the wrongdoers change their ways. Silently watching or cooperating with wrongdoers will lead to the destruction of society and of the individual, while taking action / making the right noises will make the person confident and satisfied, and help society.
  70. Suggestions for Professionals and Managers:
    1. We should not be bystanders and should not encourage bystander behavior in those around us.
    2. We should all do what we can to ensure that dissent is encouraged, not repressed, and that the channels of complaint are open.
    3. We should do what we can to build viable information and reporting systems that widely disseminate risk-related performance information. According to research, when people’s actions go unrecorded, and are therefore undetectable, the chances of shortcuts under pressure rise by a factor of TEN.
    4. We should not collude in cover-ups, even minor ones. Such cover-ups may lead to increased difficulty when it becomes necessary to reveal embarrassing facts later on. Not every incident should be written off as an acceptable risk.
    5. When there is a likely and recordable unacknowledged risk, each of us should assemble our allies and pursue a complaint with the appropriate institutional body. If all else fails, we should consider blowing the whistle (with documents). Most of us are prisoners of institutional realities that tolerate unacceptable risk in the name of practicality. The fallacy in most organizations is that lowering risks is unacceptably expensive. In fact, not only is it probably much less expensive than people think, over the long term it will probably save money as well as lives.
  71. Suggestions for Leaders:
    1. Realize that practicalities and shortcuts have costs that inevitably even out in time, and that one’s choice is either to pay now or to pay later. Maybe your policies will not lead to accidents during your tenure and you get all the appreciation for the short-term gains, but the organization suffers later when those policies lead to catastrophes in the long run.
    2. We can’t put a price tag to injuries and deaths and compensation alone is not sufficient to judge the cost.
    3. Leadership is often the originator of the financial, scheduling, or political pressures, and thus is the ultimate source of a significant increase in risk. Imposing nonnegotiable performance objectives combined with severe sanctions for failure encourages the violation of safety rules, reporting distortions, and dangerous shortcuts. Putting people in no-win performance situations encourages recklessness and fraud, inevitably increasing the chances of a major catastrophe. Leaders must therefore hold themselves accountable for the inadvertent consequences of their management philosophy and tactics.
    4. Pay scrupulous attention to design. When design is faulty, accidents happen. In organizational settings, accidents are never accidental: They are inevitably the result of faulty management, particularly the management of safety.
    5. Systemize paying attention to near misses, weak signals, and the assessments of engineers and safety officials. Leaders have to create monitoring systems, systematic review procedures, and independent information channels that do not report through the operational chain of command. While safety and risk management is perfectly compatible with efficient operations over the long term, it often runs contrary to them in the short term, especially if there have been long periods of neglect.
    6. Recognize that while every organization tolerates some dissent, on certain subjects it does not. Only leaders can eliminate these “undiscussables”. Encourage whistle-blowers in order to get timely information about risks; else bystander behavior is inevitable and will affect the organization in the long run.
    7. Create effective contingency plans for serious but low-probability risks.
    8. Every organization requires robust, independent watchdogs. There is no substitute for regulatory independence, and it should not be measured in terms of the cost of maintaining it.
    9. Leadership must subject itself to relentless review and self-criticism.
  72. Relabeling problems as opportunities can bring a true shift in the mental framework and reap benefits for the organization.
  73. The first big mental shift is accepting the inevitability of accidents and catastrophes without giving in to them. Do not wait until after a disaster strikes.
  74. The second big mental shift is appreciating the difference between new ideas and unpracticed old ones.

Mar 28, 2010

Safety culture

Recently, I heard a person question the preaching of safety culture. He questioned the credentials of the preacher, saying that the person does not wear a helmet while riding from home to office, yet preaches safety culture to others.

Yes, the person is wrong in not using a helmet, but by that act he puts only himself at risk of injury. In a plant, however, if the manager can’t inculcate a safety culture, the entire staff of the plant as well as those in the vicinity will be affected. Saying “you are wrong” does not make my own wrong act RIGHT; a wrong act is always a wrong act. Telling the safety man “your acts outside are wrong, so you can’t tell me how I should work” is not a correct argument at all. These types of arguments are like shadow boxing and will not help in improving safety at the plant.

Another example: a theft occurs in a house while the occupant is away, and the neighbour informs the occupant about it. If the occupant replies that the neighbour should not bother about what is happening in his house and should look into his own affairs, who is the loser? Obviously, the occupant. Similarly, in a factory the plant manager will be the loser; he is responsible for his acts and deeds and may be charged by the regulator depending upon the seriousness of the violations. At that time, the plant manager can’t tell the regulator that the regulator is not wearing shoes or not using a helmet. Before such a thing occurs, the internal safety officer monitors and advises the plant manager on suitable action, so that the plant manager can avoid embarrassing and difficult situations with his employees as well as the regulator.
