Jeremy M. Gernand, PhD, CSP, CRE Associate Professor of Environmental Health and Safety Engineering
John and Willie Leone Department of Energy and Mineral Engineering
College of Earth and Mineral Sciences
The Pennsylvania State University, University Park, PA, USA
The most significant Halloween safety risk for most participants is auto-pedestrian crashes. Large numbers of pedestrians are out at night in places where they are not normally present, wearing costumes that can impede their own visibility and that of drivers. Studies found a 43% higher risk on Halloween in the US: https://bit.ly/4ntKQ3W, and 34% higher in the UK: https://bit.ly/47dcJs6.
Wear bright, visible costumes (black reflective tape and fabric are manufactured now, in case you don’t want to go out looking like a safety worker), go in groups with a designated lookout, supervise small children closely, and don’t trick-or-treat under the influence.
Capturing carbon dioxide from pre-combustion natural gas (https://politi.co/47gbJSE) does not have the same co-benefits for reduced health-related effects from emissions as capturing CO2 from exhaust gases. I’m not for letting the perfect be the enemy of the good, but perhaps public investment should encourage more exhaust capture, since it delivers immediate health benefits and climate mitigation similar to pre-combustion capture. This seems like low-hanging fruit for carbon-permit financing in California.
In one of the most resounding safety victories of the past 40 years, firefighters spend less time fighting fires, and people and buildings are less likely to be lost to fire (50% fewer deaths according to Vox: https://bit.ly/4pKALBJ). There was no magic bullet. It was a long series of unexciting improvements in warning devices, building codes, regulations, reduced indoor smoking, and safer electrical systems. This isn’t automatic. It takes investment, effort, research, and good decision making, but when we agree on the objective, this kind of result is achievable.
New data from UL on the air quality impacts of 3D printers in classrooms shows these are of minimal concern, at least with proper ventilation in the school (https://bit.ly/4999W4j). Short-lived increases in ultrafine particles and VOCs were observed, but since printers are unlikely to be operating continuously, the type of low-temperature polymer 3D printers used in schools appears to be low risk, though not all printers studied previously have been so “clean.” 3D printing in general likely holds promise to significantly reduce operator exposures compared to more traditional milling and welding processes as the technology continues to develop.
Violent protests in Tunisia over air quality and pollution concerns: https://bit.ly/48yDXul. People have little control over the air they breathe. Wherever we live, we need some independent authority to manage emissions to the air, whether from major industries as in this case, or from households burning fuel for heat and cooking. It can’t be “anything goes”. Here, people are expressing their anger at a long-running issue that has not been resolved. Solutions to mitigate these emissions exist; it is only a matter of investment and attention.
This is an interesting air pollution visualization tool: https://bit.ly/47kfo32. I’m still trying to work out the details of how the ‘puffs’ of particulate matter fade/disperse. It only includes major emitters, so many local sources, especially numerous small ones like vehicle traffic and home heating, are missing. It is a useful way to visualize how pollution disperses across an area from a single origin point, but it is not a complete picture of exposure.
New paper links trichloroethylene (TCE) exposure to Parkinson’s disease (https://bit.ly/3L9rof6). TCE is a degreasing agent, has previously been used in dry cleaning and coffee decaffeination, and helps produce other chemicals. Using the EPA’s National Air Toxics Assessment introduces some uncertainty for low-concentration chemicals like TCE, and some correlated exposure may in fact be more responsible, but identifying the locations to test this idea is a valuable step. The reported increase in risk from TCE exposure is about 10%. It is important to note that the reason we can identify effects of this magnitude from exposure concentrations 2-3 orders of magnitude below the OSHA permissible exposure limit is partly due to the improvements in environmental quality achieved over the past few decades. (https://bit.ly/4o2q77N)
Managing air quality in warm sunny places is challenging, as this experience in Texas shows (https://bit.ly/42oYsWr). Photochemical effects make the same emissions more harmful than they would be in a colder, cloudier place. But this isn’t new; the methods for improving air quality are mostly well understood, and lessons from other cities show that rapid progress can be made. The public health burden of ozone and other pollutants is considerable, as it results in additional asthma, cardiovascular disease, and premature death. Individuals can’t really control their exposure to air pollution by themselves, but upgrading technology and changing practices can collectively have an incredibly positive impact.
New study from the American Lung Association suggests that replacing industrial boilers with heat pumps by 2050 would prevent 77,200 premature deaths and 204,000 asthma cases (https://bit.ly/46HchSF). Electrifying building heat and some process heat doesn’t eliminate all emissions, as most are transferred back to the power plants on the local grid, but power plants are more efficient, better maintained, and produce fewer pollutants per unit of energy than small-scale installations like oil or natural gas boilers for individual buildings. This amounts to 6 prevented cases of asthma and 2.3 prevented fatalities per replaced boiler. Most of this technology is ready today, and helping people breathe easier is an immense benefit to quality of life and productivity.
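The per-boiler figures can be cross-checked against the study’s totals; a minimal sketch, assuming the totals and the per-boiler rates quoted above cover the same period:

```python
# Cross-check: implied number of replaced boilers from the quoted totals
# (77,200 premature deaths, 204,000 asthma cases) and per-boiler rates
# (2.3 deaths and 6 asthma cases prevented per boiler).
total_deaths, deaths_per_boiler = 77_200, 2.3
total_asthma, asthma_per_boiler = 204_000, 6

boilers_from_deaths = total_deaths / deaths_per_boiler   # ~33,565
boilers_from_asthma = total_asthma / asthma_per_boiler   # 34,000

# Both figures imply roughly 34,000 replaced boilers, so the quoted
# per-boiler rates are internally consistent with the study totals.
print(round(boilers_from_deaths), round(boilers_from_asthma))
```

Both routes land near 34,000 boilers, so the per-boiler numbers hang together.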
We really need rapid-response risk maps available to media in the event of major pollution events like the refinery fire in El Segundo, CA this past Thursday (https://bit.ly/4gZEcke). In the same way as we do with hurricanes, people need to know at a glance whether they are in a region of concern, and maps work better than text descriptions now that most people have smartphones. Knowing current wind patterns, adding data from sensor networks where available, and drawing on previous similar events (fires, releases, spills, etc.) can be enough to make quick predictions, within minutes, of where people need to take action. The EPA or NWS could do this, and it would be invaluable to emergency management. As in this case, many of these will not be major exposure events, but being able to show people why they were not at significant risk would engender trust.
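As a sketch of how quick such a screening estimate can be, here is a minimal ground-level Gaussian plume calculation. The source strength, wind speed, action threshold, and the Briggs rural (neutral, class D) dispersion coefficients are all illustrative assumptions, not data from the El Segundo event:

```python
import math

def briggs_sigmas(x_m):
    """Plume spread (m) at downwind distance x, using the Briggs rural
    coefficients for neutral (class D) stability -- an assumed choice."""
    sigma_y = 0.08 * x_m / math.sqrt(1 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1 + 0.0015 * x_m)
    return sigma_y, sigma_z

def ground_concentration(q_g_s, u_m_s, x_m, y_m):
    """Ground-level concentration (g/m^3) from a ground-level point source:
    C = Q / (pi * u * sigma_y * sigma_z) * exp(-y^2 / (2 sigma_y^2))."""
    if x_m <= 0:
        return 0.0  # no plume upwind of the source
    sy, sz = briggs_sigmas(x_m)
    return q_g_s / (math.pi * u_m_s * sy * sz) * math.exp(-y_m**2 / (2 * sy**2))

# Hypothetical screening run: 100 g/s source, 4 m/s wind; flag locations
# along the plume centerline exceeding an assumed 1 mg/m^3 action level.
for x in (500, 1000, 2000, 5000):
    c = ground_concentration(100, 4.0, x, 0.0)
    print(f"{x:>5} m downwind: {c*1e3:.3f} mg/m^3", "ALERT" if c > 1e-3 else "ok")
```

A real rapid-response map would layer this kind of estimate over live wind data and sensor readings, but even this crude version runs in milliseconds, which is the point: a usable first answer in minutes, refined as data arrives.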
We usually think of indoor air quality and temperature as primarily a comfort issue, a modern convenience that we should be able to do without, but that’s not the whole story. When it’s too hot, we make worse decisions (http://bit.ly/4gWaqNg) and take risks (http://bit.ly/48cICSy). We don’t perform as well on cognitive tests when there’s more fine particulate matter in the air (https://bit.ly/4nxaFAN) or elevated carbon dioxide (https://bit.ly/3VLVkQF), a sign of poor ventilation. And these aren’t the only relevant factors; impacts on chronic disease also occur. Having clean, comfortable indoor air is a health and safety issue, and when it isn’t provided, we pay a cost in increased human error.
This is my favorite International Space Station safety requirement. I sat in far too many meetings discussing what its objective was and what was sufficient rationale, if any existed, to waive it (we did, in fact, waive it for the ARED weight machine, after a design change that limited the hole in question to a depth of about half an inch). Is a half-inch-deep feature still a hole?
NHTSA delays new rules on auto-pedestrian impact safety testing until the 2027 model year: https://bit.ly/4pNVTqN. Pedestrian deaths from vehicle impacts number more than 7,300 per year in the US, a figure that has increased in recent years and remained stubbornly high. NHTSA fails to provide an estimate of the cost and effectiveness of the modified safety rating system, so the quantitative cost of delay is unclear. This is likely because how the public will perceive future safety ratings is not obvious, but that is a poor excuse in my opinion. Uncertainty clearly exists and may be substantial, but that doesn’t mean we can’t estimate the range of likely outcomes. These studies in Germany and Sweden show how such a prediction could be made: https://bit.ly/42gFm4S; https://bit.ly/42Yy5Xs.
Notifying consumers about risk is important but needs to account for exposure, potency, and dose, not just the toxic profile of the chemical in question. California’s Prop 65 did not reduce chemical exposure on average and encouraged companies to switch to unlisted chemicals, which are less well understood: https://bit.ly/46KPcx0
Should a chemical that can cause cancer in 40% of individuals at a dose of 1 nanogram of the chemical per kilogram of bodyweight have the same warning as one that can cause cancer in 2% of individuals at a dose of 1 gram of the chemical per kilogram of bodyweight? Does it matter if the product contains 500 grams or 5 micrograms of that chemical? Or should each just be labeled: “contains a substance that could cause cancer”? The world is complex, and simple messages are important, but not when they decrease understanding. A simple scale could account for dose, toxic potency, and uncertainty, and increase public understanding more than the current warning system.
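One way to see why dose and potency both matter is a crude risk index. A minimal sketch, assuming a 70 kg person exposed to the product’s entire content and a linear dose-response capped at the stated incidence, both loud simplifications, not a toxicological model:

```python
def risk_index(amount_g, incidence, ref_dose_g_per_kg, body_kg=70.0):
    """Crude screening index: stated incidence scaled by how the actual
    dose compares to the reference dose, capped at the stated incidence.
    Assumes full absorption of the product's content (illustration only)."""
    dose = amount_g / body_kg                       # g per kg bodyweight
    return incidence * min(1.0, dose / ref_dose_g_per_kg)

# Chemical A: cancer in 40% of individuals at 1 ng/kg (very potent)
# Chemical B: cancer in 2% of individuals at 1 g/kg (weakly potent)
a = risk_index(amount_g=5e-6, incidence=0.40, ref_dose_g_per_kg=1e-9)  # 5 micrograms
b = risk_index(amount_g=500,  incidence=0.02, ref_dose_g_per_kg=1.0)   # 500 grams

# 5 micrograms of A scores 0.40 while 500 grams of B scores 0.02:
# a 100-million-fold smaller quantity carries the higher index, which
# a single uniform warning label cannot convey.
print(a, b)
```

Even this toy index separates the two cases by a factor of 20 in the opposite direction from what the product masses alone would suggest, which is the argument for a graded scale over a binary warning.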
The UK Office for Nuclear Regulation (ONR) plans a 3-phase review and update of its safety assessment principles (SAPs), with the second phase being an “AI-enhanced drafting” (http://bit.ly/4gFwlrL). To me, it’s unclear exactly what they mean by that, but this sounds like a responsible use of AI in developing safety-related documents, with plenty of expert and public comment opportunities before and after this step. We are currently on shakier ground when we employ AI to do analysis or assessment, but using it to help us write more clearly is probably a good idea.
Data centers and server farms are proliferating across the United States and other countries, and so, like any new industry, their environmental impact, especially from on-site power plants and the use of cooling water, needs to be assessed. This analysis of a Memphis data center showed relatively small increases in air pollution from burning natural gas for electricity (https://bit.ly/4mtl3bn), but that is on top of already non-compliant regional air quality. The physical location of data centers is not their most important feature from a functional perspective, so development rules should nudge installations to places that can accept the additional pollution with less of an impact on people’s health.
Airlines should do more to make sure passengers don’t try to evacuate with their carry-on bags, FAA says (https://cnn.it/3VZJy5b) | Ok, but “do more” is pretty vague… requiring passenger evacuation drills on a regular basis would help, but is not feasible for some travelers; having overhead bins automatically lock during an evacuation might help, but could also backfire; automated, repeated instructions during evacuation might help a little, but this is unclear. Experimentation is needed, but difficult to pull off as a realistic, true-to-life scenario. The FAA should fund research to help resolve this issue.
Legislative proposal to require more data sharing for autonomous vehicles is a good start (https://bit.ly/46AE4Th), but why limit such information to collisions, unplanned stoppages, and total miles? Adding “near misses” would dramatically improve the usefulness of this data.
OSHA cites a DC Circuit Court dissent (non-majority opinion) in SeaWorld v. Perez (2014) as justification to define certain performance or professional activities as “inherently risky” and outside the scope of the general duty clause, which states that employers are required to furnish a workplace “free from recognized hazards”. As written, this proposed rule change applies only to a small number of performers in the arts and athletics, which calls into question whether a regulatory change is necessary rather than an internal agency policy on enforcement.
However, the proposal asks the public for other examples of inherently risky workplace conditions [II.1] and “welcomes comment” on defining key terms in the proposed rule (presumably including “inherently risky”) [II.6], so it is possible this will be applied to other industries in the future. It is worth mentioning that the general duty clause requires the use of the best available processes and technology to mitigate hazards, not the elimination of entire categories of activity or key industries (rules with such effects are already prohibited).
Public comments remain open on this proposal until November 1st, 2025: https://bit.ly/4ppq47z
Are safety-related product recalls increasing? https://bit.ly/4nBrIkK Recalls indicate a failure of design and process control that puts users at risk. How these recalls are quantified is probably a matter for further research, but it is possible that supply chain disruptions in the current economic climate are partly responsible. Recalls are expensive (and non-recalls of real problems even more so), and most organizations would have chosen differently had they known. Safety engineering and risk management are how we quantify these issues in advance and make better-informed decisions. How much value would this information have for these companies?