Every October, artists all over the world take on a challenge: make a piece of art (usually within a certain theme, using a specific media, and using a prompt list) every day for one month.
While I would not call myself an artist (though, art and science do have things in common), I took up a hobby I’d started a few months back: brushlettering or handlettering. One letter a day. And of course, I picked a science theme.
So here you go, part 1 of #Alfabetober, inspired by Carla Kamphuis. (Yes, I realize there are only 26 letters but 31 days – the leftover days are rest days.)
The documentary depicts the rise and fall of Elizabeth Holmes, founder and CEO of the biotech company Theranos. Briefly, here is what happened:
At the age of 19, she dropped out of university and founded a company on the idea of a diagnostic blood test that could screen for over 200 different markers using only a few drops of blood, taken with a prick of a fingertip. You know, the kind they use to measure your hemoglobin when you donate blood.
The aspiration was to give the power of therapeutics and diagnostics (Theranos = THERApy + diagNOSis) to the individual, making tests significantly cheaper, less scary (no needles!), and easier for the consumer to interpret. And earlier disease detection means earlier treatment and better survival!
She founded the company in 2003 and raised over $700 million from venture capitalists and private investors in the next decade. By 2013, the company was valued at $10 billion. In 2015, the validity of the technology was questioned and Holmes, and Theranos with her, fell. Three years later, in 2018, Holmes and Ramesh “Sunny” Balwani, former Theranos Chief Operating Officer and President, were both charged with fraud and conspiracy. The Theranos saga had ended.
You can read more about Theranos and Holmes’ rise and fall on the interwebs, or watch one of the several documentaries made about the story. Rather than give you the details of Theranos, I’d like to talk more about my thoughts after watching this documentary – as a bioengineer who in 2013 was studying nanotechnology, working on a project involving “theranostics” (THERApeutics + diagNOSTICS, sound familiar?) and has more recently worked in a startup environment.
Nanotechnology and Microfluidics
From 2011 to 2013, right when Theranos was about to hit its peak, I was studying nanotechnology. The technology Theranos’ system was based on (or so they claimed) was exactly the same type of stuff I was learning about. To be able to do diagnostics on small samples, the liquid handling and detection techniques need to be scaled down, using principles of microfluidics (I was also learning about that).
I distinctly remember giving a presentation about microfluidic chips that could process blood in a way to split out the different components, i.e. centrifugation at a teeny tiny scale.
From what I remember, at that time most of this technology was still in the research phase: individual university research groups and some research institutions coming up with new approaches to handle small blood samples for diagnostic testing. Were they able to run hundreds of tests on those samples? Not that I know of. Was any of the technology commercially viable at the time? Not that I know of. But then again, I was only studying this stuff and in no way an expert.
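The reason liquid handling has to be redesigned at this scale can be made concrete with a quick Reynolds-number estimate. A minimal sketch in Python, using illustrative channel dimensions I picked for the example (not numbers from any particular chip):

```python
# At the ~100 micrometre scale, the Reynolds number of a water-like fluid
# is tiny, so flow is laminar: no turbulence, no easy mixing, which is why
# microfluidic "centrifugation" needs clever channel design.
# All numbers below are illustrative.

def reynolds_number(velocity_m_s, channel_width_m, density=1000.0, viscosity=1e-3):
    """Re = rho * v * L / mu for a water-like fluid (SI units)."""
    return density * velocity_m_s * channel_width_m / viscosity

# A typical microfluidic channel: 100 um wide, flow at 1 mm/s
re_chip = reynolds_number(1e-3, 100e-6)
# A kitchen tap for comparison: ~1 cm opening, ~1 m/s
re_tap = reynolds_number(1.0, 1e-2)

print(re_chip)  # ~0.1 -> strongly laminar
print(re_tap)   # ~10,000 -> turbulent
```

Below Re of roughly 1, viscous forces dominate completely, which is the regime those blood-splitting chips operate in.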
More importantly, I don’t remember hearing about Theranos. It clearly had not made enough of a buzz, scientifically, to reach across the pond.
When I was watching the documentary, it talked a little bit about the technology and my reaction was: “That won’t work. You can try to scale down one or two of these tests to work with small samples OR you can try to do a lot more tests with the same amount of sample. But not both at the same time – that just sounds idealistic!”
Clearly, it didn’t. (Obviously, otherwise, there would not have been as many articles written about this, or documentaries made.)
Why we still love a genius
Part of the reason people bought into the Theranos story is that it was enticing. Silicon Valley just loved this young woman, who wore turtlenecks (Steve Jobs much?) and felt she had to lower her voice to be taken more seriously. Part of being part of startup culture is being good at selling a story – whether it’s factual or not, realistic or not, is beside the point.
Another reason Theranos was successful at first and able to raise so much money was that Holmes was extremely well connected. She was charismatic. She was able to surround herself with big names from the worlds of investment, the military, law, and even politics.
Here’s the thing, I get it. Seeing images of this young woman, who clearly is motivated by making the world a better place, who clearly made an effort to hire an inclusive and diverse workforce, who was able to found a company at such a young age, after dropping out of college no less, I would have admired her too! Making it “big” as a woman in the male-dominated tech world is not an easy feat and she seemed to have made it happen (at the time).
But here’s another thought: why do we so want to love the story of a “genius”? We love hearing about wunderkinder and how they become the youngest to do this or the first to do that. Is it because somehow we think that we too can be a genius? That we can have our own great, amazing story? But most of us won’t (and that’s okay).
Maybe we should stop glorifying individuals in science, research, and tech, because especially nowadays, science doesn’t happen in isolation. Progress happens in small steps with massive groups of people collaborating to make it happen.
(Yet, we still have Nobel Prizes and love celebrating greatness.)
Why we love to see someone fall
We love to admire an individual making it happen against all odds, working so hard so they can make it, despite the system working against them. But similarly, we love seeing someone at the top fall. Perhaps we like seeing them fail so we can feel better about ourselves for not “succeeding” as they did – and then it turns out they didn’t either.
What I disliked about “The Inventor” is that it made it seem like it was all Holmes’ fault. They spoke to some of the men who invested in her company, who believed in her, and they all tried to shift the blame onto her, as if she had “seduced” them into supporting her too-idealistic cause.
Holmes surrounded herself with yes-men. Surround yourself with people who keep telling you that you’re great, and you will start thinking you’re great. Just like being told over and over again that you’re not good enough for something will get to your head.
The documentary, with its overly dramatic animations of blood entering a microcentrifuge system and dice rolling, makes Holmes look evil, and the people around her, who enabled her, look like victims of her charm. And the same type of men then went ahead to explain how she failed.
Of course, Holmes did play a big part in this story, but she’s not the only one to blame. It’s the system that gives people the privilege of better networks and connections to succeed without really having to try. It’s the culture of Silicon Valley that celebrates taking risks even when they are completely ridiculous to take. It’s the startup mentality of “Fake it till you make it,” forgetting that quite often they won’t actually make it. It’s all of us, who love hearing about wunderkinder and geniuses and celebrate the individual instead of the collective for their achievements.
How do I end this word vomit about how the glorification of geniuses is perhaps not the healthiest thing?
Well, people do have good ideas. And sometimes the high-risk, high-reward world of tech and startups is the way to make these good ideas happen. And sometimes those ideas fail and we should just admit to ourselves why we enjoy watching that happen.
To quote the YouTuber MedLife Crisis (paraphrased): We shouldn’t hype medicine. Thank you for coming to my MedTalk.
Finally, after months of not really writing blog-related content, I leaf through the pile of articles from Science Magazine I had ripped out to find inspiration. On the very top of the pile, I find a short piece on the recently (I’m talking March 2020) discovered fossil of Oculudentavis khaungraae – the tiniest dinosaur. Or is it?
Is it a bird? Is it a lizard?
While doing research, I quickly discovered that the original paper was retracted in July – that’s what I get for getting behind on blogging, I guess.
In short, the paper published in March describes the discovery of a tiny head (7 mm long) embedded in amber, which was categorised as a bird-like dinosaur, making it the tiniest dinosaur ever found. This creature would probably have been about the size of the smallest living bird, the bee hummingbird. The researchers noted that the creature had large eye sockets (Oculudentavis means “eye-tooth bird,” so they would have been big on eyes, and toothy), like modern lizards.
Finding the tiniest dinosaur would have been pretty cool. But in July the paper was taken down – apparently, new evidence had come to light showing that the fossil might not have been a dinosaur, and therefore not some type of prehistoric bird, but an unusual lizard.
Despite the etymology of the word dinosaur (“terrible lizard”), dinosaurs are actually more closely related to birds than to modern-day lizards. The word “dinosaur” does get used as an umbrella term for prehistoric reptile-like creatures – and is depicted as such in children’s books and blockbuster movies – but dinosaurs, including the feathered kind that survived the mass extinction some 66 million years ago and eventually evolved into what we now know as birds, and reptiles are different things.
Dinosaurs (including birds) do share ancient ancestry with reptiles such as crocodiles, lizards, and snakes: dinosaurs and crocodiles both descend from the archosaurs, while lizards and snakes branched off earlier in the evolutionary tree.
If you find an ancient prehistoric reptile-like fossil, you can tell whether you are looking at a dinosaur or a prehistoric reptile by looking at the hips – for as the ancient saying goes, hips don’t lie. Reptiles have a sprawling stance: their legs connect to the hips on the sides. Dinosaurs however have an upright stance: their legs connect to their hips straight under the body, just like birds – which makes sense because birds are dinosaurs!
I should add that the exact classification of dinosaurs and their subgroups is not entirely agreed on. So if you are a bit confused, you’re not alone. And if you, like all of the Jurassic Park/World franchise, want to call awesome, terrible, sometimes gigantic, extinct, reptile-like creatures by the name Dinosaur, I won’t stop you.
The teeniest dinosaur, but not really
For the fossil found in amber, however, the new fossil data (not yet published) apparently proves that it is not the teeniest dinosaur. Instead, it should be classified as a lizard, albeit an unusual one.
I could end there, but I want to mention one more controversy that I found while looking into this tiny dinosaur debacle, which brings up some of the ethics of fossil mining. These fossils were found in amber mines in Myanmar, mines that are situated in a military conflict zone and riddled with landmines. In addition, these amber mines mostly consist of long tunnels that are dangerous for the miners to work in, and many of the miners work under horrific and exploitative conditions. You can read more about these ethical concerns here: http://markwitton-com.blogspot.com/2020/03/the-ugly-truth-behind-oculudentavis.html.
I’ve been gone, but that does not mean I haven’t been writing! I’ve been testing out some more comedic writing styles, which you can find published (!) on DNAtured (for science-related topics) and The Foreigner Blog (for non-science topics). You can read them here:
“Hoezo” is a Dutch word meaning “How so?” or “Why?”, and also the name of a popular science quiz that was on TV during my teenage years. From it, I distinctly remember that we find blue foods generally yucky-looking because a lot of molds are blue, and that we think mirrored pictures of ourselves look better because that’s what we’re used to seeing (whereas other people find the non-mirrored pictures more flattering).
The word also kind of sounds like “Ouzo” – an anise-based liquor from Greece that has the cool property of being clear until you add water, a phenomenon aptly named “the Ouzo effect.”
The Ouzo Effect
For the ouzo effect to occur, we need three components: an oil, water, and an alcohol.
The alcohol (in this case ethanol) and anise oil (also known as anethole) can be mixed. Same for ethanol and water. But anethole and water don’t mix very well: oils are generally hydrophobic.
When you add water to an anise-based alcoholic drink – Ouzo, but also Pastis and Absinthe – the liquid turns from clear to milky. By mixing these three liquids, two of which don’t mix well, you create an emulsion: little oil micro-droplets suspended in the liquid.
Usually, oil-in-water emulsions are highly unstable, but in the case of this delicious drink* the emulsion is highly stable, making it of special interest for colloid researchers to study things like nano-droplet and micro-emulsion formations.
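The droplet formation above boils down to supersaturation: diluting the ethanol lowers how much oil can stay dissolved. Here is a toy numerical sketch – the exponential solubility curve and every constant in it are made-up illustrative values, not measured properties of anethole:

```python
# Toy model of why diluting ouzo turns it cloudy. Assumption (for the demo
# only): anethole solubility rises steeply with ethanol fraction; the
# exponential form and constants are invented, not measured.
import math

def anethole_solubility(ethanol_fraction):
    """Assumed solubility of oil (g per 100 g drink) vs ethanol weight fraction."""
    return 0.01 * math.exp(8.0 * ethanol_fraction)

def dilute(ethanol_fraction, oil_content, water_parts):
    """Mix 1 part drink with `water_parts` parts water; return new fractions."""
    total = 1.0 + water_parts
    return ethanol_fraction / total, oil_content / total

def is_cloudy(ethanol_fraction, oil_content):
    """The drink turns milky once there is more oil than can stay dissolved."""
    return oil_content > anethole_solubility(ethanol_fraction)

print(is_cloudy(0.40, 0.1))            # False: neat ouzo is clear
ethanol, oil = dilute(0.40, 0.1, 3.0)  # top up with 3 parts water
print(is_cloudy(ethanol, oil))         # True: excess oil nucleates micro-droplets
```

The water dilutes the ethanol faster than it dilutes the oil’s need for it, so the oil drops out of solution as the light-scattering droplets that make the drink milky.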
Some of the most efficient flying creatures in nature are flying insects. For the limited amount of neurons they have, they are incredibly competent in terms of locomotion, navigation, and maneuverability. For a roboticist working in microscale flight, creating an autonomous flying device as small, light, and versatile as an insect is the dream.
Therefore, it is not a surprise that researchers study insects to improve their mini flying robots.
One example is a small quadcopter drone developed by Nakata et al., whose collision avoidance system was inspired by the southern house mosquito. The researchers hypothesized that mosquitos actively sense sound and airflow – specifically, changes in the air patterns created by their own wings as they move close to an obstacle.
Based on this system, the researchers designed a small drone that would sense an obstacle coming close, and automatically course-correct using this low-power sensing method.
Robotics + entomology = robontomology?
Creating flying robot-insects is not the only reason roboticists are interested in insects. The intersection between robotics and entomology can also be useful to better understand insect behavior.
For example, in an effort to answer the more basic question of how flying insects navigate their environment, traditional methods proved quite limiting. Tethering an insect predictably interferes with flight, as does confining it to a room where tracking cameras can monitor its flight. In comes robotics: an open cage mount with an autonomous tracking camera (a reactive controller) gives the insects free range to zoom around, while the complex flight patterns of moths, fruit flies, and mosquitos can be tracked at speeds of up to 3 m/s.
In other research, robot-insect hybrids can help understand insect brain function. By linking an insect brain to a small mechanical robot, the sensing response of different insects can be closely studied. For example, a Mantis-bot has been used to unravel the mechanism of mantis’ visual sensing and subsequent motor response.
The educational project BackYard Brains, which uses fun DIY experiments to explore the function of neurons and brains, also uses this robinsect approach to show how electrical impulses can control cockroach movement.
Okay, no, rather than having an insect-sized robot walking around and taking pictures, the researchers made a considerably lighter camera-backpack that beetles could walk around with and take pictures! A big bottleneck for insect-sized robotics is that these gadgets require power, and batteries are kind of heavy. So by reducing the gadget to a steerable camera arm on the back of a beetle, rather than making a whole robot that needs to move around and maneuver, the researchers managed to cut down significantly on the weight.
Also, it’s cute as hell!
Thanks for the robot-insect update, Valerie. But what about the Killer Bees?
For the Black Mirror fans, not to worry, no-one is making swarms of bees (yet).
Sources and original research papers linked throughout the text.
If you’ve ever done any cell culture, whether in a biology course, during grad school, or in an industrial research setting, chances are you’ve worked with HeLa cells.
About a week ago, I started drafting this post after my supervisor mentioned “I could just use any cell to test [a new protocol on], like, even HeLa cells.” Then today, via the Instagram account @womenengineerd, I learned Henrietta Lacks was born exactly 100 years ago (+ 3 days). So, it feels even more important to highlight this story: what are HeLa cells, who was Henrietta Lacks, and why is this all so important?
The source of a cell
In 1951, a poor Black woman went to the Johns Hopkins Hospital with cervical cancer. Without asking for permission, the doctors took some of the tumor cells to study and made a remarkable discovery: these cells continued to grow and survive in culture. They were immortal.
Later that year, that woman died, but her cells lived on for decades, and will likely continue to live on for many more. That woman’s name was Henrietta Lacks, and the cells she provided are a staple in practically every cell biology lab: HeLa cells.
An immortal cell
Immortalized cells are incredibly useful for biological research. They can be taken from cancer biopsies (now with consent!) or created by inducing mutations in other cells, in both cases giving the cells the potential to live on forever.
Researchers can continue to grow them in culture, and use them for biological, biochemical, pharmaceutical, and biotechnological research. They are easy to work with and don’t really require any special attention, because they just want to grow, grow, grow.
HeLa cells were the first cells that were immortalized, and have been used extensively ever since they were taken from Henrietta Lacks.
The legacy of Henrietta Lacks is immense. A search for “HeLa cells” on Google Scholar prompts 1,730,000 search results (not that this is an accurate estimate of actual research conducted with HeLa cells), and over 17,000 US patents use HeLa cells.
From my personal experience, it seems that HeLa cells are used everywhere, from undergrad cell biology labs to ground-breaking research in both academia and industry. It’s hard to say for sure how influential this one cell type has been, or how much money it has made the companies selling them.
The irony of immortality
But while HeLa cells have been one of the most important ingredients for modern biology, neither Henrietta Lacks nor Lacks’ family received any of the benefits. It was not until the 70s that her family was even informed that their relative’s cells were used in such a widespread way. Furthermore, HeLa cells were bringing in the big bucks, while her family had little money (ironically, some of them could not afford health insurance).
As I’ve stated, I’ve used HeLa cells. Cells that were extracted from a Black woman without her knowledge or her consent. Cells that have made companies millions, without any contribution to her family. Cells that have helped us understand basic biology and the function of genes and proteins in our body, that have helped develop new medicines and treatments for cancer, that have taught many of us the principles of cell culture, all without teaching us their origin story and problematic history.
I’m not saying we should no longer use certain cells, but we should at least be aware of their potentially problematic histories. Johns Hopkins University has been working with the Lacks family to honour Henrietta’s legacy; since the 50s, standards of consent and research ethics have been established; and Henrietta’s story is more widely known thanks to a book and a movie.
Nevertheless, as far as I can find, her descendants have not been compensated in any way. In a 2017 interview, her grandson Ron stated: “It’s not all about the money. My family has had no control of the family story, no control of Henrietta’s body, no control of Henrietta’s cells, which are still living and will make some more tomorrow.”
So, right after what would have been her 100th birthday, what can we do to give control back to her family?
We all know Wikipedia. It’s almost impossible not to.
For me, from a quick look-up of some fact to prove your point in an argument with friends, to double-checking a chemical structure for schoolwork, or to translate an obscure plant name I can’t think of the English name for; I’ve used Wikipedia consistently for well over a decade.
I’ve always known that Wikipedia was an online encyclopedia that anyone could edit. But I’d never even considered making an edit myself. Until one day in April, I received an email from 500 Women Scientists with the opportunity to attend a 6-week wiki-editing course. I’d already been working from home for a few weeks, with a considerably lower workload than usual, and – to be honest – not quite sure what to do with myself. So, I jumped on the opportunity to learn how to use the skills I already have — hey, I’m a scientist, I’ve been researching and writing and fact-checking for years! — to make Wikipedia a more inclusive place.
500 Women Scientists Wikipedia
About 10 women scientists gathered twice a week to learn how to edit Wikipedia with one main goal: putting more women on Wikipedia. I was saddened, but not surprised, to learn that of all the biographies on Wikipedia, only ~18% are about women. That percentage is ~16% if we only look at academic biographies, and it drops down to ~6.5% for female engineers, my own field.
One potential reason for this is that a lot of Wikipedia editors are men. And – likely due to implicit bias – they write and edit articles about… other men. Even if the academic world is becoming more inclusive, this isn’t necessarily reflected on the online encyclopedia that everyone uses.
And that’s a problem. Middle or high schoolers looking to learn more about notable figures in a field of interest, who don’t find anyone who looks like them or comes from a similar background, might be turned off from pursuing studies in that field. So that’s where 500 Women Scientists’ Wikipedia effort comes in. By increasing the representation of women in the academic biography category of Wikipedia, either by improving existing articles or writing new ones (for example through the Women in Red Wikiproject, which aims to write articles for “redlinked” women), we can make Wikipedia a better and more inclusive resource.
That all sounds good, but how?
Okay, so I knew I wanted to make Wikipedia more inclusive and I knew why, but that didn’t really help me with the “how.” Again, the fact that anyone can edit doesn’t mean I felt comfortable doing so right away! Luckily, the WikiEducators (if that’s the term – the course was organized by WikiEducation, and everything related to Wikipedia seems to have “Wiki” in it!) walked us through the core policies of Wikipedia, the dos and don’ts, and helped us through our first article edits.
Here is a list of things that stuck (but you can find all that is relevant to editing Wikipedia, on – you guessed it! – Wikipedia):
Statements on Wikipedia must be verifiable, which does not mean they are necessarily true. It just means there’s a sourceable body of work to back up the statement. This feels counterintuitive (shouldn’t we be writing “the truth”?) but it ensures there are reliable sources for everything on Wikipedia.
Wikipedia is not a place for opinion; articles should reflect a neutral point of view. I did like that this means weighting by consensus, as opposed to the journalistic rule of equal time. For example, if 90% of climate researchers are in agreement that climate change is real, that viewpoint should be reflected in 90% of the article.
To have a biography on Wikipedia, a person must be notable. They have to meet criteria with regards to their academic achievements, prizes won, and impact to merit a presence on the online encyclopedia. In an academic culture where men are typically still more valued than women, this can be another factor for why there are so few biographies about women on Wikipedia.
The definition of Wikipedia as an “online encyclopedia” is incredibly broad, and apparently it’s easier to define what Wikipedia is not.
You can contribute to Wikipedia in several different ways, whether it’s writing new content, taking care of layout, correcting spelling and grammar, or making Wikipedia more aesthetically pleasing (just to name a few).
Making the first edit
The first edit was scary!
What if I made a mistake? What if I undid someone else’s edit and stepped on their toes? What if I did something that was inherently anti-Wikipedian?
Wikipedia’s mantra is “Be Bold” – make the change! The beauty of a massively open, crowd-sourced, and peer-reviewed platform is that almost everyone there is willing to help. It’s not seen as a faux-pas to make mistakes, and if you do, someone else will come along and fix it. Accidentally left in a typo? Someone will fix it. Mistakenly got a fact not quite right? Someone will fix it. Change someone’s important edits without noticing? They can come back and undo your change. And Wikipedia keeps track of all the changes in the “history” tab, making the whole editing process transparent and traceable.
Working on the second article was considerably easier. Sure, there are still some really tricky things, like adding images or editing boxes, but overall making edits on Wikipedia is really easy!
“So fix it”
Another Wikipedia Mantra is “So fix it”: if you see something wrong, make it better.
If you see a lack of representation, write a new article. Make existing articles better (I was surprised to learn how some articles in the outer corners of Wikipedia are not great). Increasing representation is not just about getting more biographies of women on Wikipedia. Black, Indigenous and People of Color academics are more underrepresented on Wikipedia than they are in academia (thanks to the #editWikipedia4BlackLives effort on June 10th and ongoing efforts from the people involved, that will hopefully change), and Pride month brings LGBTQA+ themed “editathons” (sessions where groups of people edit pages together). Wikipedia is a group effort, and together we can all make Wikipedia better: more representative, more inclusive, and more equitable. I myself plan to edit or write one article a week! 💪
In the last few months, a lot of us have been confined to our homes. We no longer commute daily to our workplace, spend less time stuck in traffic, and have canceled our travel plans. With fewer cars on the road, fewer airplanes in the sky, and the shutdown of some industrial activities, global CO2 emissions are likely to have decreased. In fact, a recent paper estimated the emission reductions based on predictive models and reported a decrease in daily global CO2 emissions of ~17% by April 2020 compared with mean 2019 levels.
The same paper predicts that the total average emissions of 2020 will decrease by somewhere between 4% and 7% compared to the 2019 average, depending on the duration of confinement.
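The gap between a −17% daily peak and a −4% to −7% annual average is mostly arithmetic: a big but temporary drop averages out over the year. The two-level emission profile below is my own back-of-the-envelope simplification, not the paper’s actual model:

```python
# Toy arithmetic: a deep but short-lived daily emissions drop becomes a much
# smaller annual-average drop. The step-shaped profile is an assumption made
# for illustration, not the model used in the paper.

def annual_reduction(peak_drop, confinement_days, partial_drop=0.0, partial_days=0):
    """Average fractional emission reduction over a 365-day year."""
    return (peak_drop * confinement_days + partial_drop * partial_days) / 365

# ~3 months near the April-2020 peak of -17%:
print(round(annual_reduction(0.17, 90), 3))              # 0.042 -> about -4%
# add a longer tail of partial restrictions (-8% for 150 days):
print(round(annual_reduction(0.17, 90, 0.08, 150), 3))   # 0.075 -> toward the -7% end
```

Longer or stricter confinement stretches the reduced-emissions window, which is exactly why the paper gives a range that depends on confinement duration.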
Most of the scientific community agrees that changes in the amount of CO2 in the atmosphere have an effect on global temperature; this fact will likely not surprise you. But perhaps it will surprise you that this has been known for a long time.
Not just for a few decades. But for two centuries.*
Climate science in the 19th century, yes, it was a thing.
In the early 19th century, scientists suspected that the earth’s atmosphere keeps the planet warm by transmitting visible light but absorbing infrared light (or heat), and that human activity could change the atmosphere’s temperature. Among them was Joseph Fourier, who in an 1827 paper mentioned that “the progress of human societies” could – over the course of many centuries – change the “average degree of heat.”
In 1859, Fourier’s theoretical musings were turned into experiments, when John Tyndall, an Irish physicist, published his study investigating the absorption of infrared light by different gases. This was the first** experiment showing how heat absorption by the atmosphere could lead to temperature rises, and that certain gases – such as water vapor, methane, and CO2 – absorb more heat than others.
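The core of Fourier’s and Tyndall’s reasoning survives in the classic single-layer greenhouse model: sunlight passes through the atmosphere, but a fraction of the outgoing infrared is absorbed and partly re-radiated back down. A minimal sketch of that textbook toy model (not a climate simulation):

```python
# Single-layer greenhouse toy model: the atmosphere is transparent to
# sunlight but absorbs a fraction `eps` of the surface's infrared emission,
# re-radiating half of it back down. Standard textbook exercise.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0     # solar constant, W m^-2
ALBEDO = 0.30      # fraction of sunlight reflected back to space

def surface_temperature(eps):
    """Equilibrium surface temperature (K) for IR absorptivity eps in [0, 1]."""
    absorbed = SOLAR * (1 - ALBEDO) / 4        # sunlight averaged over the sphere
    t_effective = (absorbed / SIGMA) ** 0.25   # ~255 K with no greenhouse effect
    return t_effective / (1 - eps / 2) ** 0.25

print(round(surface_temperature(0.0)))   # 255 K: airless, frozen Earth
print(round(surface_temperature(0.78)))  # 288 K: roughly today's global average
```

Raising `eps` – which is what more atmospheric CO2 effectively does – raises the surface temperature, which is the qualitative point both Tyndall’s experiments and Foote’s hypothesis were driving at.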
Three years earlier…
But wait! Three years before Tyndall’s paper, another paper had appeared in the American Journal of Science and Arts: “Circumstances affecting the Heat of the Sun’s Rays,” showing how the sun’s rays interacted with different gases and concluding that CO2 trapped the most heat compared to air and hydrogen. The paper was by a woman named Eunice Newton Foote.***
Now, years after her experiments and findings, Foote is credited to be the first scientist to have experimented on the warming effect of the sun’s light on the earth’s atmosphere and the first to theorize that changing levels of CO2 would change the global temperature. In her paper, she stated that:
“An atmosphere of that gas would give to our earth a high temperature; and if, as some suppose, at one period of its history, the air had mixed with it a larger proportion than at present, an increased temperature from its own action, as well as from increased weight, must have necessarily resulted.”
Foote (1819-1888) was a farmer’s daughter and lived in a time when women were typically not considered scientists. She did not have a sophisticated laboratory, so her experimental setup was rather amateurish compared to Tyndall’s a few years later. When her results were presented at the American Association for the Advancement of Science conference, it was not by her, but by Professor Joseph Henry of the Smithsonian.
While she gained some recognition for her work at the time, it was rather limited, and she was forgotten by history. Henry presented her work at the conference, prefacing the talk with: “Science was of no country and of no sex. The sphere of woman embraces not only the beautiful and the useful, but the true.” She was also praised in the September 1856 issue of Scientific American, in a piece titled “Scientific Ladies.”
It wasn’t until 2010, however, when her paper was rediscovered by a retired petroleum geologist, that her name was slowly put back on the climate science map.
“She had three strikes against her. She was female. She was an amateur. And she was an American.”
There weren’t very many female scientists at the time. Women had a hard time getting formal (science) education.
She did not have a traditional science education, and her experimental setup was nowhere near the sophistication of Tyndall’s: her experiment was much simpler and limited in its results – she was not able to distinguish between visible and infrared radiation. But her serendipitous discovery that CO2 traps more heat than the other gases she tested, and her hypothesis that changing atmospheric CO2 affects global temperature, were the first of their kind.
Finally, Europe was still the epicenter of scientific discovery at the time. The US, and physics in the US, were still very much up and coming. At the same time, communicating discoveries overseas without fiber optics and the internet was just not as trivial as it is today.
For many decades, John Tyndall was considered the father of climate science, and granted, he was the first to show that certain gases absorb more heat radiation specifically (rather than radiation in general) than other gases. But Foote was the mother, the first to theorize what we now know to be true: changing levels of atmospheric CO2 result in changes in global temperature. And now, almost two centuries later, she’s remembered for it.
So while you’re working from home and putting less CO2 into the atmosphere as a result, spare a little thought for the woman scientist who first linked CO2 with temperature. The fact that she did is pretty amazing.
I highly recommend this Cogito video on the history of Climate Change:
I’m currently in a room with probably about 250 records. None of them are actually mine (except perhaps this one), but the presence of this amount of vinyl has got me on several thought-trains: remembering when I was back home with my parents going through their 70s and 80s music collection while writing my thesis, or when I was basically doing the Pomodoro technique by working on thesis corrections in chunks equal to however long a side of an LP would play; wondering why people are so into vinyl; wondering how sound can possibly make it onto a piece of polymer that can be read out by a needle, and why some vinyl records are black and others super colorful.
What is vinyl? How is vinyl? Why is vinyl?
Okay, I did some digging. Luckily, I possess the ability (and currently, an excess of time) to surf the internet and learn things. Through blog posts, very satisfying YouTube videos, and WikiHows, it’s easy to find out how things are made. But let’s dig a bit deeper and look into the production, chemistry, and other science around vinyl.
So, go put on your favorite music, on vinyl or otherwise, sit back and read on.
Vinyl: what’s in a name?
Vinyl is a synthetic material containing a specific chemical group (unsurprisingly named the “vinyl group”) with the chemical formula -CH=CH2:
Vinyl is also the common name used for the polymer polyvinyl chloride (PVC). Polymers are long chains of molecules that have a repeating unit. Polymers are present everywhere in nature (think proteins and DNA) as well as in man-made materials (synthetic rubbers, plastics, …). In the case of PVC, the repeating unit is a vinyl group with a chloride tagged on. This plastic is made from ethylene (from crude oil) and chlorine (derived from salt) and looks like a long repeating chain of:
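If you think of a polymer like a programmer would (a toy analogy only, no actual chemistry happening here), it’s just one monomer unit repeated over and over:

```python
# Toy illustration: a polymer is a monomer unit repeated many times.
# PVC's repeating unit is vinyl chloride: CH2-CHCl
monomer = "CH2-CHCl"

# A real PVC chain has on the order of a thousand units; five is enough
# to show the pattern.
pvc_chain = "-" + "-".join([monomer] * 5) + "-"
print(pvc_chain)
# -CH2-CHCl-CH2-CHCl-CH2-CHCl-CH2-CHCl-CH2-CHCl-
```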
For record making, PVC arrives as pellets that can be melted together and molded into a putty (apparently called a “biscuit”). This is pressed between two plates that contain the negative pattern of the grooves containing the music. Wait… let’s start from the beginning…
The making of a record: Cut, Plate, Press
Vinyl records are made through the process of cutting, plating, and pressing.
In the first step, recorded sound is etched onto what is called a lacquer disc, which is a flat aluminum disc coated with a layer of nitrocellulose lacquer (basically a layer of nail polish). Recording machines (called lathes) have a very sharp, heated sapphire tip that cuts grooves into the surface of a blank lacquer disc, driven by the vibrations of the recording. This lacquer disc can theoretically be played back (which is also done for quality checking), but the material is too delicate for repeat plays.
It is ideal, however, for creating a stamper, the mold used to make vinyl records. To make a stamper, the lacquer disc is first coated with a silver solution. Then, this shiny disc is immersed in a tin or nickel chloride bath for electroplating: tin or nickel ions in the solution are attracted to the silver layer coating the lacquer disc and deposit as a metal layer. This layer has the opposite structure of the lacquer disc: instead of grooves, the physical representation of the sound protrudes out.
The stamper is then used to press a bit of vinyl putty into a finished record. The two stamper sides (one for side A and one for side B) are heated to ~180°C (~356°F). The malleable PVC biscuit is placed between the stampers, which are then pushed together by hydraulic pressure, imprinting the grooves onto the PVC. Excess PVC spilling over the edges is cut off, et voilà, the record is ready to play. This pressing process takes less than 30 seconds, and the same mold can be reused to make a whole pile of vinyl records with the same music!
PVC is colorless in its raw form, so to create that typically black record look, PVC pellets are colored using carbon black. This is the same material that makes car tires black. In tires, it has the excellent properties of being conductive – making sure no static electricity builds up – and of making the rubber sturdy. In vinyl, it also reinforces the polymer, making the material stronger and more stable over time, ensuring that you can play the record time and time again with the same sound quality (-ish).
Now, PVC pellets can be colored to create different colors using different dyes. Historically, these dyes would not have the same reinforcing effect as carbon black, but nowadays the difference in quality is negligible. In fact, production mistakes have a bigger effect on sound quality and durability than leaving out carbon black.
Edit on 4/24/2020: This is extremely satisfying and relevant:
Another option is picture discs, which consist of three distinct layers: a core PVC layer without any music, a layer with the picture, and a clear plastic sheet containing the grooves for the music. This final plastic layer is more malleable than PVC and therefore not as durable; picture records (and glow-in-the-dark records) are more susceptible to loss in durability and sound quality, but you’d have to be a real expert to notice.
Keep on turning
There you go, you have all the information you need to become a vinyl record collector. And impress other record collectors with your knowledge on vinyl. Shall we talk LaTeX next time?
Sources linked throughout the text. Cover image is from Bit.Trip’s “Greatest Chips”
We can’t hold public gatherings anymore. So conferences and meetings are moving to virtual, which is… interesting?
Last month, I attended Science Talk 2020 (#SciTalk20), an annual conference about everything science communication that’s usually held in Portland, OR. Not this year. This year it was on the internet.
I’ve never been – it’s crossed my radar the past few years, especially because Portland isn’t that far, but the combination of no longer being a student (so no student attendance fees) and the time/effort/cost of travelling (let’s face it, sometimes I’m just lazy) meant I never made the trip down.
This year however, there was no trip required, and I knew I’d probably have the time to attend (two afternoons), so why not? I love the scicomm community on Twitter and this could be a new way to connect.
I like attending conferences, but sometimes I’m just so tired at the end of the day from always being on. I enjoyed going to the #AAAS2020* meeting partially because I could just go home straight after. Sure, part of conferences – and I might argue perhaps one of the most important parts – is networking, those coffee breaks and meet-ups in bars and connecting over drinks, but attending a conference from your lazy desk chair has some perks:
You can get up and grab a coffee or go to the bathroom whenever you want without feeling like you’re bothering the speaker by getting up.
You can shamelessly doodle, knit, cross-stitch, … whatever type of “mindless” activity you like without feeling self-conscious. I particularly like this, because even during the most interesting of talks, I have the tendency to fall asleep, and doing something with my hands helps me stay awake.
You don’t have to dress up. Well, attend a conference in your PJs. Super comfy. You don’t even need to pack!
The catering is as amazing as you make it!
One of my favorite things about the conference was the chat room, similar to the chat in a live-streamed YouTube video: constantly running in the background. It was pretty amazing to talk (mostly about the ongoing session, but sure, there were also jokes) without bothering the speaker; at any other conference, whispering in the back row would be frowned upon.
The chat room gave attendees the opportunity to network and share resources directly. A lot of questions came up live, discussions got started, etc. It was like having a live tweet feed, but a bit faster. In addition to the live-streamed speaker sessions, coffee breaks (with a chat open) gave people the opportunity to connect, discuss, and joke around.
So should all conferences go virtual?
Nah, of course not. There are aspects of in-person conferences that would be very difficult to implement virtually, such as networking events, (some) interactive workshops, and exhibition halls. But live-streaming can definitely make conferences more interactive and accessible. Rethinking how conferences are organized can potentially increase their impact: can some conferences be completely or partially held online to reach more people? Do we really always have to travel halfway across the world for a meeting?
The organizers of #SciTalk20 showed that moving a meeting online in a matter of weeks is possible, with great speakers, wonderful attendees, and a disco party to end with.
* The annual meeting of the American Association for the Advancement of Science. You can read some of my session reports here, here, and here.