Blog

Currently Happening Presently Now: ZOMBIES

3/31/14

Untitled 4440
Newitz, A. (2006). Pretend we’re dead: Capitalist monsters in American pop culture. Duke University Press.

In Pretend We’re Dead, Annalee Newitz argues that the slimy zombies and gore-soaked murderers who have stormed through American film and literature over the past century embody the violent contradictions of capitalism. Ravaged by overwork, alienated by corporate conformity, and mutilated by the unfettered lust for profit, fictional monsters act out the problems with an economic system that seems designed to eat people whole.

Newitz looks at representations of serial killers, mad doctors, the undead, cyborgs, and unfortunates mutated by their involvement with the mass media industry. Whether considering the serial killer who turns murder into a kind of labor by mass producing dead bodies, or the hack writers and bloodthirsty actresses trapped inside Hollywood’s profit-mad storytelling machine, she reveals that each creature has its own tale to tell about how a freewheeling market economy turns human beings into monstrosities.

Newitz shows that as literature and film tell it, the story of American capitalism since the late nineteenth century is a tale of body-mangling, soul-crushing horror.


Lauro, S. J., & Embry, K. (2008). A zombie manifesto: The nonhuman condition in the era of advanced capitalism. boundary 2, 35(1), 85-108.

The zombie has been one of the most prevalent monsters in films of the second half of the twentieth century...The zombie has become a scientific concept by which we define cognitive processes and states of being, subverted animation, and dormant consciousness. In neuroscience, there are “zombie agents”; in computer science there are “zombie functions.”...The ubiquity of the metaphor suggests the zombie’s continued cultural currency, and we will investigate why this specter has captured the American imagination for over a century. We want to take a deeper look at the zombie in order to suggest its usefulness as an ontic/hauntic object that speaks to some of the most puzzling elements of our sociohistorical moment, wherein many are trying to ascertain what lies in store for humanity after global capitalism—if anything.

Our fundamental assertion is that there is an irreconcilable tension between global capitalism and the theoretical school of posthumanism. This is an essay full of zombies—the historical, folkloric zombie of Haitian origin, which reveals much about the subject position and its relationship to a Master/Slave dialectic; the living-dead zombie of contemporary film, who seems increasingly to be lurching off the screen and into our real world (as a metaphor, this zombie reveals much about the way we code inferior subjects as unworthy of life); and finally, we are putting forth a zombie that does not yet exist: a thought-experiment that exposes the limits of posthuman theory and shows that we can get posthuman only at the death of the subject.  Unlike Donna Haraway’s “Cyborg Manifesto,” we do not propose that the position of the zombie is a liberating one—indeed, in its history, and in its metaphors, the zombie is most often a slave...


In the figure of the zombie, subject and object are obliterated. This figure, simultaneously slave and slave rebellion, is a more appropriate reflection of our capitalist moment, and even if it holds less promise than a cyborg future, its prophecy of the posthuman is more likely to come to fruition. The zombie, we feel, is a more pessimistic but nonetheless more appropriate stand-in for our current moment, and specifically for America in a global economy, where we feed off the products of the rest of the planet, and, alienated from our own humanity, stumble forward, groping for immortality even as we decompose.
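
A side note on the "zombie functions" invoked above: the closest concrete relative in computing is the Unix zombie process, a child process that has terminated but lingers in the process table, neither alive nor gone, until its parent acknowledges the death. A minimal sketch in Python (Linux-only, purely illustrative):

import os
import time

pid = os.fork()
if pid == 0:
    os._exit(0)    # the child dies immediately...
else:
    time.sleep(1)  # ...but the parent has not wait()ed for it yet,
                   # so the child is now a zombie: terminated, unreaped, undead
    with open(f"/proc/{pid}/stat") as f:
        state = f.read().split()[2]
    print("child state:", state)  # 'Z' marks a zombie on Linux
    os.waitpid(pid, 0)            # reaping lays the zombie to rest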

Michel, F. (2007). Life and Death and Something in Between: Reviewing Recent Horror Cinema. Psychoanalysis, Culture & Society, 12(4), 390-397.

I argue that popular zombie films can be seen as expressions of the return of repressed cultural concerns about class antagonism and about the violent and deadening effects of modern life...

The observation that it is ‘‘easier to imagine the end of the world than the end of capitalism’’ is often cited with the implication that this is surprising or illogical or ironic (Jameson, 2007, p 199). But of course it is easier to envision the end of the world because doing so requires chiefly extrapolating from existing conditions, continuing in the directions we are already going. In contrast, imagining the end of capitalism requires envisaging substantial changes in existing structures, institutions, and habits...

Gothic horror emerges as the obverse of the Enlightenment: Victor Frankenstein's scientific research yields a creature that seems "my own vampire... let loose from the grave" (Shelley, 1969, p 77). Vampires develop with industrial capitalism. Marx famously describes capital as "dead labour which, vampire-like, only lives by sucking living labour, and lives the more, the more labour it sucks" (Marx, 1976, p 342). But if capital is vampiric – like the aristocratic Dracula with his new property in England – then members of the laboring class are zombies. Historically, that is, the cultural associations with zombies align them with workers...

Borg, M. B. (2005). A zombie storms the meathouse: Approximating living and undergoing psychoanalysis in a palliative care culture. Psychoanalysis, Culture & Society, 10(1), 1-22.

Human – that is, emotional – responses to everyday stimuli are increasingly pathologized, and we are increasingly promised the obliteration of all personal suffering. Yet at the core of all these human responses to suffering that need remedy is a deep sense of empathy with the struggles associated with simply existing at this time in this society, in a state of perpetual dread over the immense social problems that infect those around us, and that seem (and often are) insurmountable. It seems in our society it is the experience of empathy that is most feared, most defended against, and most abstained from, as if compassion is the ultimate contagion that, if experienced in full force, would lead to breakdown.

...the definitive aim of life in such a society is to promote fulfillment (in the form of increased satisfaction and decreased discomfort or pain) for individuals instead of communities. Other authors have analyzed the pervasiveness of this perspective in many social and cultural institutions (including psychotherapy and the social sciences in general) in capitalist societies, particularly the United States. As a character defense on a societal level, liberal individualism allows us to avoid acknowledging the social consequences of a sanctioned perspective that supports fulfillment for some and suffering for others. As this perspective becomes rigidified, it functions as an ideology.

Consistent with such a notion of ideology, control and allocation processes are dissociated; problems and fulfillments become increasingly individualized, separated from their social and cultural influences and etiologies. The inability to empathize with both self and other diminishes the potential for subversive or revolutionary processes to exert any impact on societal transformations or daily functions. Today there are many examples of compassion being pushed aside in favor of rules and regulations that diminish our awareness of our collective fragility...

Stratton, J. (2011). Zombie trouble: Zombie texts, bare life and displaced people. European Journal of Cultural Studies, 14(3), 265-281.

There has been a recent upsurge in texts featuring zombies. At the same time, members of western countries have become increasingly anxious about displaced peoples: asylum-seekers and other so-called illegal migrants who attempt to enter those countries. What displaced people (people without the protection of the state) and zombies have in common is that both manifest the quality of what Giorgio Agamben calls 'bare life'. Moreover, zombies have the qualities of workers or slaves driven to total exhaustion. The genre of the zombie apocalypse centres on laying siege to a place that is identified as a refuge for a group of humans. In these texts it is possible to read an equation of zombies with displaced people who are 'threatening' the state. Indeed, the rhetoric used to describe these people constructs them as similar to mythical zombies...

Mahoney, P. (2011). Mass psychology and the analysis of the zombie: From suggestion to contagion. Generation Zombie: Essays on the Living Dead in Modern Culture, 113-129.


 





Currently Happening Presently Now: AUTISM

3/30/14

Untitled 4450
Den Hartog, J. E., & Moro-Ellenberger, S. L. (2005). How do you measure a child? A study into the use of biometrics on children. TNO, Delft.

Mitka, M. (2010). Rising autism rates still pose a mystery. JAMA, 303(7), 602.

"... Researchers now believe that early intensive intervention is a key to improving outcomes for children with an autism spectrum disorder..."

Camarata, S. (2014). Early identification and early intervention in autism spectrum disorders: Accurate and effective? International Journal of Speech-Language Pathology, 16(1), 1-10.

Over the past decade, there has been increased interest in identifying autism and autism spectrum disorder (ASD) in toddlers. Although there is a strong rationale for identifying ASD early and delivering effective intervention, a recent report in the journal Pediatrics raises important questions about the scientific evidence currently available supporting early intervention. In addition, the British National Health Service (NHS) has not adopted universal screening for autism, even though the American (US) Academy of Pediatrics endorsed a recommendation that all toddlers be screened for ASD by the age of 24 months (in 2007). The goal of this initiative is to identify and, where indicated, provide early intervention for autism and ASD. Although it is inarguable that this is a worthwhile and laudable goal, the systematic study of this goal is confounded by the inherent difficulty in reliably identifying autism in 24-month-old toddlers. It is challenging to demonstrate intervention effects in the absence of randomly assigned control groups in an increasingly heterogeneous ASD population. The purpose of this paper is to examine the current literature on early identification and early intervention in autism and ASD and to provide a framework for examining these issues.


Wing, L., & Potter, D. (2002). The epidemiology of autistic spectrum disorders: Is the prevalence rising? Mental Retardation and Developmental Disabilities Research Reviews, 8(3), 151-161.

For decades after Kanner's original paper on the subject was published in 1943, autism was generally considered to be a rare condition with a prevalence of around 2–4 per 10,000 children. Then, studies carried out in the late 1990s and the present century reported annual rises in incidence of autism in pre-school children, based on age of diagnosis, and increases in the age-specific prevalence rates in children. Prevalence rates of up to 60 per 10,000 for autism and even more for the whole autistic spectrum were reported. Reasons for these increases are discussed. They include changes in diagnostic criteria, development of the concept of the wide autistic spectrum, different methods used in studies, growing awareness and knowledge among parents and professional workers and the development of specialist services, as well as the possibility of a true increase in numbers. Various environmental causes for a genuine rise in incidence have been suggested, including the triple vaccine for measles, mumps and rubella (MMR). Not one of the possible environmental causes, including MMR, has been confirmed by independent scientific investigation, whereas there is strong evidence that complex genetic factors play a major role in etiology. The evidence suggests that the majority, if not all, of the reported rise in incidence and prevalence is due to changes in diagnostic criteria and increasing awareness and recognition of autistic spectrum disorders. Whether there is also a genuine rise in incidence remains an open question.

Shattuck, P. T. (2006). The contribution of diagnostic substitution to the growing administrative prevalence of autism in US special education. Pediatrics, 117(4), 1028-1037.

"..Prevalence findings from special education data do not support the claim of an autism epidemic because the administrative prevalence figures for most states are well below epidemiological estimates. The growing administrative prevalence of autism from 1994 to 2003 was associated with corresponding declines in the usage of other diagnostic categories."

Matson, J. L., Wilkins, J., & González, M. (2008). Early identification and diagnosis in autism spectrum disorders in young children and infants: How early is too early? Research in Autism Spectrum Disorders, 2(1), 75-84.

"..The prevailing wisdom appears to be that early identification is good, and the earlier the better. However, the arguments put forth, while compelling, are largely unsubstantiated by data. Researchers need to establish if early identification, for example, results in better family support..."

Perrin, E. C., Newacheck, P., Pless, I. B., Drotar, D., Gortmaker, S. L., Leventhal, J., ... & Weitzman, M. (1993). Issues involved in the definition and classification of chronic health conditions. Pediatrics, 91(4), 787-793.

The need for a widely applicable definition of chronic conditions for research, policy, and program development has led to an extensive review of the development of such definitions, the considerations involved in their use, and some recommendations for a new approach. This paper examines some of the methodologic and conceptual issues related to defining and classifying chronic conditions and describes some consequences resulting from decisions made about these issues. While most examples are taken from child health applications, the basic concepts apply to all age groups. The dominant method for identifying and classifying children as having a chronic condition has relied on the presence of an individual health condition of lengthy duration. This condition-specific or "categorical" approach has increasingly seemed neither pragmatically nor conceptually sound. Thus, the development of a "generic" approach, which focuses on elements that are shared by many conditions, children, and families, is recommended. Such a definition might reflect the child's functional status or ongoing use of medical services over a specified time period. In addition, it is suggested that conditions be classified based on the experience of individual children, thus emphasizing the tremendous variability in expression of seemingly similar conditions.

Johnson, C. P., & Myers, S. M. (2007). Identification and evaluation of children with autism spectrum disorders. Pediatrics, 120(5), 1183-1215.

"...The prevalence of autism and, more recently, ASDs is closely linked to a history of changing criteria and diagnostic categories. Autism first appeared as a separate entity with specific criteria in the DSM-III in 1980. In 1987, the Diagnostic and Statistical Manual of Mental Disorders, Third Edition, Revised (DSM-III-R) listed broadened AD criteria and the new subthreshold category of PDD-NOS, both of which promoted inclusion of milder cases. Later, these changes received criticism for being too inclusive and for promoting overdiagnosis. The DSM-IV criteria published in 1994 reflected the result of years of analyses to reduce the overinclusiveness of the DSM-III-R criteria; however, it included AS for the first time, which, in effect, broadened the range of disorders..."

Federal study estimates 1 in 88 children has symptoms of autism
By David Brown, Published: March 29, 2012
The Washington Post

"...If the rising prevalence represents an actual increase of the disorders — and is not the consequence of finding previously undiagnosed cases — that suggests there may be environmental exposure or demographic change (such as older parenthood) that is responsible, because a population’s genetic background wouldn’t change over a few decades..."

Baio, J. (2012). Prevalence of Autism Spectrum Disorders: Autism and Developmental Disabilities Monitoring Network, 14 Sites, United States, 2008. Morbidity and Mortality Weekly Report. Surveillance Summaries. Volume 61, Number 3. Centers for Disease Control and Prevention.

"...Because the ADDM Network sites do not make up a nationally representative sample, these combined prevalence estimates should not be generalized to the United States as a whole.

These data confirm that the estimated prevalence of ASDs identified in the ADDM network surveillance populations continues to increase. The extent to which these increases reflect better case ascertainment as a result of increases in awareness and access to services or true increases in prevalence of ASD symptoms is not known..."


U.S. autism rate surges, CDC reports
By Lenny Bernstein, Updated: March 27, 2014 at 1:25 pm
The Washington Post

"..The CDC said it would be announcing a new initiative later Thursday to encourage parents to have young children screened for autism in their early years, and given the support they need. Officials said most children are not diagnosed until they are at least four years old, though identification is possible as early as two years old. Any parent who has concerns about how a child  plays, learns, speaks, acts or moves should seek an assessment, officials said..."

Caruso, D. (2010). Autism in the US: Social movement and legal change. American Journal of Law & Medicine, 36, 483.

The social movement surrounding autism in the US has been rightly defined as a ray of light in the history of social progress. The movement is inspired by a true understanding of neuro-diversity and is capable of bringing about desirable change in political discourse. At several points along the way, however, the legal reforms prompted by the autism movement have been grafted onto preexisting patterns of inequality in the allocation of welfare, education, and medical services. In a context most recently complicated by economic recession, autism-driven change bears the mark of political and legal fragmentation. Distributively, it yields ambivalent results that have not yet received systemic attention. This article aims to fill this analytical vacuum by offering, first, a synoptic view of the several legal transformations brought about or advocated for by the autism movement and, second, a framework for investigating their distributive consequences.

Perez, V. W. (2010). The rhetoric of science and statistics in claims of an autism epidemic. Advances in Medical Sociology, 11, 203-221.

Purpose – To examine the rhetorical use of scientific medical evidence and diagnoses statistics in claims of an epidemic of childhood autism spectrum disorder.

Methodology/approach – Qualitative analysis of the content and dissemination of claims in several venues for social problems construction, including popular media, peer-reviewed scientific literature, and the Internet.

Findings – Rhetorical use of etiological evidence, both scientific and experiential, positing a causal link between medical interventions (e.g., vaccines), environmental toxins, and autism is prominent across several arenas for social problems construction. Claims and counterclaims involve statements amiable to or critical of evidence and its relationship to the scientific method. Presentation of diagnoses statistics and covariation with vaccination regimens are integral as a rhetorical device in claims of a true change in prevalence.

Contribution to the field – Elucidates how the medicalization of childhood developmental disabilities and increased lay involvement (e.g., parents) in the social problems process were vital for the proliferation of attention and resources directed to autism presently. The fundamental scientific method and the lack of sufficient, valid scientific evidence are not integral to the continuation of the movement that posits vaccines cause autism. The content of these claims is unfettered on the Internet as an arena for claimsmaking, allowing a lay social movement to continue that often stands in opposition to recognized scientific authority and evidence.


Altucher, J. (2009, August 10). Save the children (and make money). The Wall Street Journal.


 





Currently Happening Presently Now: ENDGAME

3/29/14

Untitled 4270
Joy, B. (2000). Why the future doesn't need us. Nanoethics: The ethical and social implications of nanotechnology, 17-39.

I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals...

As this enormous computing power is combined with the manipulative advances of the physical sciences and the new, deep understandings in genetics, enormous transformative power is being unleashed. These combinations open up the opportunity to completely redesign the world, for better or worse: The replicating and evolving processes that have been confined to the natural world are about to become realms of human endeavor.


But many other people who know about the dangers still seem strangely silent. When pressed, they trot out the "this is nothing new" riposte - as if awareness of what could happen is response enough. They tell me, There are universities filled with bioethicists who study this stuff all day long. They say, All this has been written about before, and by experts. They complain, Your worries and your arguments are already old hat.

I don't know where these people hide their fear. As an architect of complex systems I enter this arena as a generalist. But should this diminish my concerns? I am aware of how much has been written about, talked about, and lectured about so authoritatively. But does this mean it has reached people? Does this mean we can discount the dangers before us?

Knowing is not a rationale for not acting. Can we doubt that knowledge has become a weapon we wield against ourselves?

The experiences of the atomic scientists clearly show the need to take personal responsibility, the danger that things will move too fast, and the way in which a process can take on a life of its own. We can, as they did, create insurmountable problems in almost no time flat. We must do more thinking up front if we are not to be similarly surprised and shocked by the consequences of our inventions.



 





Currently Happening Presently Now: BIOPOLITICS

3/28/14

Untitled 4273
"It should be possible to design a world in which behavior likely to be punished seldom or never occurs. We try to design such a world for those who cannot solve the problems of punishment for themselves, such as babies, retardates and psychotics, and if it could be done for everyone, much time and energy would be saved..."
-B.F. Skinner, Beyond Freedom and Dignity, 1971, page 66.

"And that," put in the Director sententiously, "that is the secret of happiness and virtue--liking what you've got to do. All conditioning aims at that: making people like their unescapable social destiny"
-Aldous Huxley, Brave New World, 1932, page 13.

"There is no subjugation so perfect as that which keeps the appearance of freedom, for in that way one captures volition itself."
-Jean-Jacques Rousseau, Emile

"The experience of the Holocaust brings into relief, however, another social mechanism; one with a much more sinister potential of involving in the the perpetration of the genocide a much wider number of people who never in the process face consciously either difficult moral choices or the need to stifle inner resistence of conscience. The struggle over moral issues never takes place, as the moral aspects of actions are not immediately obvious or are deliberately prevented from discovery and discussion. In other words, the moral character of action is either invisible or purposefully concealed.
To quote Hilberg again, 'It must be kept in mind that most of the participants [of genocide] did not fire rifles at Jewish children or pour gas into gas chambers...Most bureaucrats composed memoranda, drew up blueprints, talked on the telephone, and participated in conferences. They could destroy a whole people by sitting at their desk.'...The increase in the physical and/or psychic distance between the act and its consequences achieves more than the suspension of moral inhibition; it quashes the moral significance of the act and thereby pre-empts all conflict between personal standard of moral decency and immorality of the social consequences of the act. With most of the socially significant actions mediated by a long chain of complex causal and functional dependencies, moral dilemmas recede from sight, while the occasions for more scrutiny and conscious moral choice become increasingly rare."
-Zygmunt Bauman, 'Social production of moral invisibility,' Modernity and the Holocaust, 1989, pages 24-25.

Mordini, E., & Massari, S. (2008). Body, biometrics and identity. Bioethics, 22(9), 488-498.

Scholars speak of ‘informatization of the body’ to point out the digitalization of physical and behavioral attributes of a person and their distribution across the global information network.

According to a popular aphorism, biometrics are turning the human body into a passport or a password. As usual, aphorisms say more than they intend. Taking the dictum seriously, we would be two: our self and our body. Who are we, if we are not our body? And what is our body without us? Briefly, at the core of the notion of ‘informatization of the body’ there is a concern for the ways in which digitalization of physical features may affect the representation of ourselves and may produce processes of ‘disembodiment’. While privacy advocates and civil liberty organizations are concerned with the risk of function creep, philosophers are often concerned with informatization of the body, because it would touch our inner nature, the ‘human essence’.

Biometric systems digitalize physical appearances and behaviors in order to process them. The passage from analogical to digital representations is not a trivial one, because digital representations always imply a certain degree of simplification, which modifies the nature of the represented object. By digitalizing representations of body parts and behaviors, biometric technologies tend to remove from them all dimensions but those which are relevant to recognition. Ideally, biometrics aims to turn persons into mere living objects, which can be measured and matched with similar living objects.
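
To make the abstraction concrete, here is a minimal sketch in Python of the reduction the passage describes: a body becomes a fixed-length template, and identity becomes a distance between templates. The bit-length, noise rate, and match threshold are invented for illustration, not drawn from any real biometric system:

import random

def enroll(seed: int, bits: int = 256) -> list[int]:
    # Stand-in for feature extraction: derive a binary template.
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(bits)]

def hamming(a: list[int], b: list[int]) -> float:
    # Fraction of differing bits between two templates.
    return sum(x != y for x, y in zip(a, b)) / len(a)

person = enroll(seed=42)    # "the body", digitalized
rescan = [b ^ (random.random() < 0.05) for b in person]   # same body, with sensor noise
stranger = enroll(seed=7)   # a different body

THRESHOLD = 0.25   # invented decision boundary
print("matches self:", hamming(person, rescan) < THRESHOLD)       # True
print("matches stranger:", hamming(person, stranger) < THRESHOLD) # False

Everything the system "knows" of the person is the 256 bits; whatever the template discards, the system cannot see.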

The Italian philosopher Giorgio Agamben has argued that there are times when rulers create indistinct zones between human life and bare life. Agamben, following Carl Schmitt and Walter Benjamin, calls these times ‘states of exception’. In states of exception, humans are stripped of all meanings except the fact they have life, and that life, like the life of an animal, can be taken at any point without it being considered murder, as happened in the concentration camps.

The tattooing of concentration camp victims was rationalized as ‘the most normal and economic’ means of regulating large numbers of people. With this logic of utility applied during a similar state of exception in the United States today, the US-Visit’s bio-political tattooing enters a territory which ‘could well be the precursor to what we will be asked to accept later as the normal identity registration of a good citizen in the state’s gears and mechanisms’.

Agamben envisages the reduction to bare bodies for the whole humanity. For him, a new bio-political relationship between citizens and the state is turning citizens into pure biological life; and biometrics herald this new world.




 





Currently Happening Presently Now: EUGENICS

3/27/14

Untitled 4275
Beckwith, J. (2001). On the social responsibility of scientists. Annali dell'Istituto Superiore di Sanità, 37(2), 189-194.

The author outlines the history of genetics in the United States, looking at all the social and political implications of it, too often underestimated by the geneticists themselves. In contrast to physicists, who were forced to recognize the consequences of their role in the development of the atomic bomb and who openly carried a historical burden from their past, geneticists had no historical memory and were essentially ignorant of their own "atomic" history: the Eugenics movement in the first half of the 20th century, which significantly affected social policy in the United States and Europe. Few geneticists, in fact, until recently, were aware of the Eugenics movement itself. It was only with the extreme misuse of genetics by German scientists and the Nazi Government that some English and US geneticists began to speak out more openly. The author sees in this lack of awareness the major responsibility of geneticists for the misrepresentation and misuse of science and also calls for a better interaction between scientists and those who work in other social fields; a communication gap between the two cultures holds dangers for us all.

Micklos, D., & Carlson, E. (2000). Engineering American society: the lesson of eugenics. Nature Reviews Genetics, 1(2), 153-158.

We stand at the threshold of a new century, with the whole human genome stretched out before us. Messages from science, the popular media, and the stock market suggest a world of seemingly limitless opportunities to improve human health and productivity. But at the turn of the last century, science and society faced a similar rush to exploit human genetics. The story of eugenics--humankind's first venture into a 'gene age'--holds a cautionary lesson for our current preoccupation with genes.

Gray, P. (1999). Cursed by eugenics. Time, 153(1), 84-85.

At a time when science promises such dazzling advances in the practice of medicine, it may be prudent to cast a glance over the shoulder, back to an earlier era when scientists--or people who thought they were doing science--stirred hopes that better days were only a generation or so away. The rise and fall of the theory known as eugenics is in every respect a cautionary tale. The early eugenicists were usually well-meaning and progressive types. They had imbibed their Darwin and decided that the process of natural selection would improve if it were guided by human intelligence. They did not know they were shaping a rationale for atrocities.

Lovgren, S. (2005). One-fifth of human genes have been patented, study reveals. National Geographic News.

Behar, D. M., Rosset, S., Blue-Smith, J., Balanovsky, O., Tzur, S., Comas, D., ... & Genographic Consortium. (2007). The Genographic Project public participation mitochondrial DNA database. PLoS Genetics, 3(6), e104.

The Genographic Project is studying the genetic signatures of ancient human migrations and creating an open-source research database. It allows members of the public to participate in a real-time anthropological genetics study by submitting personal samples for analysis and donating the genetic results to the database. We report our experience from the first 18 months of public participation in the Genographic Project, during which we have created the largest standardized human mitochondrial DNA (mtDNA) database ever collected, comprising 78,590 genotypes. Here, we detail our genotyping and quality assurance protocols including direct sequencing of the mtDNA HVS-I, genotyping of 22 coding-region SNPs, and a series of computational quality checks based on phylogenetic principles. This database is very informative with respect to mtDNA phylogeny and mutational dynamics, and its size allows us to develop a nearest neighbor–based methodology for mtDNA haplogroup prediction based on HVS-I motifs that is superior to classic rule-based approaches. We make available to the scientific community and general public two new resources: a periodically updated database comprising all data donated by participants, and the nearest neighbor haplogroup prediction tool.
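
The "nearest neighbor–based methodology" mentioned above can be pictured with a toy example: treat each HVS-I motif as a set of variant positions and assign a query the haplogroup of its most similar reference. A minimal sketch in Python; the motifs and the Jaccard similarity are illustrative guesses at the general idea, not the paper's actual data or algorithm:

# Toy nearest-neighbour haplogroup prediction from HVS-I motifs.
# Reference motifs below are invented for illustration.
REFERENCES = {
    "H":  {16519},
    "U5": {16192, 16256, 16270},
    "L2": {16223, 16278, 16390},
}

def jaccard(a: set[int], b: set[int]) -> float:
    # Overlap between two sets of variant positions.
    return len(a & b) / len(a | b) if (a | b) else 1.0

def predict_haplogroup(motif: set[int]) -> str:
    # Nearest neighbour: the reference whose motif is most similar.
    return max(REFERENCES, key=lambda hg: jaccard(motif, REFERENCES[hg]))

print(predict_haplogroup({16192, 16270}))   # -> "U5"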

Rifkin, J. (2005). Ultimate therapy: Commercial Eugenics in the 21st Century. Harvard International Review, 27(1), 44-49.

Presents information on the evolution and advancement of biological sciences in the twenty-first century, focusing on genetic engineering. How companies will benefit from these developments; Anticipations of molecular biologists; Benefits of genetic manipulation; Concerns over genetic stigmatization; Controversy surrounding human germ-line therapy.

"The new language of the information sciences has transformed many molecular biologists from scientists to engineers, although they are, no doubt, scarcely aware of the metamorphosis."

Darnovsky, M. (2002). The New Eugenics: The Case Against Genetically Modified Humans. Different Takes, 4.

"Promoting a future of genetically engineered inequality legitimizes the vast existing injustices that are socially arranged and enforced. Marketing the ability to specify our children's appearance and abilities encourages a grotesque consumerist mentality toward children and all human life. Fostering the notion that only a ”perfect baby” is worthy of life threatens our solidarity with and support for people with disabilities, and perpetuates standards of perfection set by a market system that caters to political, economic, and cultural elites. Channeling hopes for human betterment into preoccupation with genetic fixes shrinks our already withered commitments to improving social conditions and enriching cultural and community life."

Darnovsky, M. (2005). Human Rights in a Post-human Future. biopoliticaltimes.org

Most people are well aware that efforts to “improve the human gene pool” and “breed better people,” notoriously widespread from the end of the nineteenth century through the middle of the twentieth century, led to some of the most extreme violations of civil, political, and human rights in recent history. Nonetheless, five or six decades ago—before the structure of DNA had been deduced, before the modern environmental movement—most of the provisions of the Genetic Bill of Rights would have seemed nonsensical.

Even twenty-five years ago—before the development of genetic manipulation at the molecular level, legal doctrines that allow governments to grant patents on life, and DNA databases; before the advent and commercialization of in vitro fertilization and the screening of in vitro embryos; before the appearance of advertisements for social sex selection in mainstream U.S. publications—the document would have been widely considered an unwarranted over-reaction based on dystopian fantasy.

But here we are, at the beginning of the twenty-first century. Plants and animals are routinely genetically modified, patented, and brought to market by corporate enterprises. Genetic technologies are increasingly applied to human beings for forensic and medical purposes.

Nelkin, D., & Andrews, L. (1999). DNA identification and surveillance creep. Sociology of Health & Illness, 21(5), 689-706.

The use of DNA fingerprinting as a means of identification is expanding. The technology appeals to military, law enforcement, and other government authorities: those seeking evidence to establish the identity of a dead body, a missing person, a relative, or the perpetrator of a crime. The increased use of DNA identification and the development of DNA banking systems have intensified concerns about surveillance and privacy. More than just a source of identification, DNA databanks are also subject to abuse for political or economic ends. This article describes the expansion of mandatory genetic testing focusing on disputes that have occurred when those required to provide DNA samples raise concerns about psychological harm and discrimination based on the information revealed by their DNA. We use these disputes to analyse the problems of 'surveillance creep' as growing numbers of people have their DNA on file.

Editorial. (1988). Life Industrialized. New York Times.

Life is special, and humans even more so, but biological machines are still machines that now can be altered, cloned and patented. The consequences will be profound, but taken a step at a time, they can be managed.



 





Currently Happening Presently Now: POWER

3/25/14

Untitled 4271
"But in thinking of mechanisms of power, I am thinking rather of its capillary form of existence, the point where power reaches the very grain of individuals, touches their bodies and inserts itself into their actions and attitudes, their discourse, learning processes and everyday life."
-Michel Foucault, Power/Knowledge: Selected Interviews and other writings 1972-1977, Ed. Colin Gordon, Trans. Colin Gordon, et al., 1980, page 39.

"Fundamentally, the control of consciousness via the control of discourse is a vast generalized case of interiorized surveillance, what we earlier refered to as the discursive procedure of surveillance, or panopticism. Panopticism, here, is not merely a discursive procedure. It is a socio-political goal. Coercion, rather than operating by means of external pressures such as physical restraints, functions by means of the internalization of procedures of discourse within the consciousness of the 'controlled' subjects...With new communications technology, there is an internalization of such discursive procedures as means/end logic, a problem-solving conception of knowledge, and certain patterns of information and ordering. Once internalized, all of these procedures have a surveillance capacity over the subject's social interaction. Since such procedures characterize both discourses on and of new communications technology, it is not far-fetched to suggest that such an internalization is occuring in the 'information society'. In other words, the users of new communications technology have become the principal operators of their own subjection by assuming and interiorizing the discusive procedures on and of new communications technology.
Panopticism is a seminal concept for understanding the relationship of communications procedures to social control. It is far too costly and complicated for a State or an institution to maintain control over its citizens by means of external, physical pressure.
Panoptic discursive control has three advantages: (1) it allocates power at a very low cost; (2) it provides for maximum breadth of social control, especially as discourse is mass-diffused; (3) the output of discursive formations may be directly linked to the economic growth of power...new communications technology, by becoming ever more ubiquitous and more standardized, also congeals discourse into fixed, internalizable patterns of knowledge and, consequently, of social behavior."
-Marike Finlay, Powermatics: A discursive critique of new communications technology, 1987, pages 177-78.

"Power treats human bodies as so many signs to be subjected to an incessant decoding process: its eye isolates, examines, judges, and corrects them, sometimes eradicates, even 'vaporizes' them when necessary...People who are identical to one another become alien to one another. They must also be expropriated from their own selves...The uprooted man is amnesiac, memory is forbidden him.
Power must thus become master of language since language is the living memory of man and offers him a space for inner resistance. Language constitutes a screen between the totalitarian gaze and the human body, it offers the shelter of its shadow, it veils the harsh light needed to read bodies. Language threatens the totalitarian enterprise. It is in fact the zone of obscurity where the gaze is lost. People must therefore be cured of their language: old and obscure terms must be eliminated, areas that escape definition, and zones of indetermination-ambiguity, equivocation, polysemy wiped out. Signs must be purged and purified of their meaning and bodies of their substance. And then they must be refilled...
The invention of Newspeak owes much to this ideal of absolute visibility (Panopticism), and Orwell had rediscovered it in Ogden's minimal language...Ogden was one of the most authoritative interpreters of the linguistic developments in Bentham's thought. Basic English is merely its application. Besides, in 1929 Ogden had already conceived of a language even more abbreviated than Basic because it contained no more than 500 vocabulary words. Its name? Panoptic English..."
-Jean-Jacques Courtine, A Brave New Language: Orwell's Invention of Newspeak in 1984, SubStance, vol. 15, no. 50, pages 69-74.

Moats, L. (2012). Reconciling the common core state standards with reading research. Perspectives on Language and Literacy, 38(3), 15-18.

"It is not clear why 'Speaking and Listening' standards are separate from 'Language,' for example, until one realizes that the 'Language' standards pertain almost exclusively to written, not oral, language. The language standards at each grade level presume oral language competence and mastery of foundational reading and writing skills. There is no category for 'Writing Foundations' to parallel 'Reading Foundations' and thus the foundational skills of writing, including handwriting, spelling, punctuation, capitalization, grammar, usage, and sentence composition, are either overlooked, underestimated in importance, or awkwardly forced into other categories with no explicit link to composition. To further mask important skill domains affecting higher order learning, the reading foundations are placed after the text comprehension standards, implying that they are secondary to, incidental to, or less deserving of instructional emphasis than the literary and informational text reading standards. Although advancement of reading comprehension and engagement of students with high-quality texts is obviously a worthy goal, the educational path to that goal is by no means clear from the organization of the CCSS document."

"The U.S. military has long been at the forefront of research into applications of technology for the training of military personnel. Douglas Noble points out in his book The Classroom Arsenal that U.S. military expenditures on educational technology research far exceed what civilian agencies spend...When computer-based education was introduced into schools, much of the military mind-set came with it. Computers were seen as an efficient way for children to learn basic skills, using drill-and-practice programs that still account for a large proportion of school computer use. The military's need for swift information processing and decision-making lies behind the emphasis on using computer programs to develop problem-solving and decision-making skills-skills that are sometimes assessed purely in terms of successful computer use....And, as in society as a whole, the willingness to embrace computer technology in schools has consequences that extend well beyond the mere fact of its use....When children learn to use a computer, they are not just learning a skill. They are changing the relation between themselves and the world around them."
-Alison Armstrong and Charles Casement, The Child and the Machine: how computers put our children's education at risk, 2000, pages 10-11.

"The computer is the totem of the post-industrial 'revolution' in education and computer literacy is the passport to its rewards. What is more astonishing even than the number of tracts about computer literacy is their conformity and predictability. Each chants a familiar litany about a new kind of learning that is individualised, student-centered, active, experiential and so on. We want to focus here on computer literacy only in the context of social control. In order to do this we shall begin by exploring the concepts of learning, knowledge and the mind which inform this messianic doctrine. The fundamental premise is that of cybernetics: the axiomatic belief that the human mind funtions like, or as, a computational machine.
This approach privileges rational procedures, goal-directed behaviour and cognitive structures. It emphasises that problem-solving skills entail solving problems through 'algorithmic thinking'...the development of computer skills is very much conceptualised within a process-oriented model for the curriculum which emphasises the development of information handling and problem-solving skills....The current obsession with skills, competences, process learning, and so on, is a further stage in the normalising 'scientific' discourse of cognitive development. Its achievement is a conception of the student or trainee as an information processing machine."
-Kevin Robins and Frank Webster, The Technical Fix: Education, Computers and Industry, 1989, pages 225-26, 229.

"But the Panopticon was also a laboratory; it could be used as a machine to carry out experiments, to alter behavior, to train or correct individuals. To experiment with medicines and monitor their effects. To try out different punishments on prisoners, according to their crimes and character, and to seek the most effective ones. To teach different techniques simultaneously to the workers, to decide which is the best. To try out pedagogical experiments...one could bring up different children according to different systems of thought, making certain children believe that two and two do not make four or that the moon is a cheese....The Panopticon is a privileged place for experiments on men, and for analysing with complete certainty the transformations that may be obtained from them. The Panopticon may even provide an apparatus for supervising its own mechanisms....Thanks to its mechanisms of observation, it gains in efficiency and in the ability to penetrate into men's behavior; knowledge follows the advances of power, discovering new objects of knowledge over all the surfaces on which power is exercised."
-Michel Foucault, Discipline and Punish: The birth of the prison, 1979, pages 203-04.

van der Ploeg, I. (2008). Machine-readable bodies: Biometrics, informatization and surveillance. In Identity, Security and Democracy (NATO Science Series, pp. 85-94). Amsterdam: IOS Press.

This paper sets out to give a brief overview of the most compelling ethical and social implications of biometrics. It is based on several years of research funded by the Dutch organization for scientific research (NWO), and the EC funded Support Action Biometric Identification Technologies and Ethics (BITE). First, the issue of the status of biometric data is discussed, and second, it is argued that biometrics are an instance of the wider phenomenon of the contemporary redefinition of the body in terms of information, or the informatization of the body. In the third section, the implications of the arguments so far are drawn out by highlighting the ways in which biometric applications are caught in a series of paradoxes and tensions relating to identification, social categorization, surveillance, and democratic control.




 





Currently Happening Presently Now: TECHNOLOGY

3/24/14

"Information technology and microelectronics are largely overlapping. Information technology englobes such different things as bookprint, reprography, the telephone network, broadcasting, the typewriter and the computer.... There is one case, however, where microelectronics is so interwoven with a particular information technology that they are almost identified in public opinion. It is precisely this case of computing, automated data processing, which is of outstanding interest in our context. Another area where microelectronics is very important is communications technology, including not only mass media but also interactive media like the telephone, or computer/communications networks....The present explosion of information technology and of microelectronics is much more closely related to the functioning of society as a whole than was the industrial revolution. To a much greater extent than other technologies, microelectronics affects the very essence of social cohesion, i.e. communication. Information and communication constitutute the fabric of society in more than a metaphorical sense. They do not remain unaffected when communication processses are mediated, channelled and partly taken over by technical devices."
-Klaus Lenk, Information Technology and Society, in Microelectronics and Society, Gunter Friedrichs, ed., 1982, pages 273-74.

"But it is much later in the game now, and ignorance of the score is inexcusable. To be unaware that a technology comes equipped with a program for social change, to maintain that technology is neutral, to make the assumption that technology is always a friend to culture is, at this late hour, stupidity plain and simple...Introduce the alphabet to a culture and you change its cognitive habits, its social relations, its notion of community, history and religion. Introduce the printing press with moveable type, and you do the same. Introduce speed of light transmission of images and you make a cultural revolution. Without a vote. Without polemics. Without guerrilla resistance. Here is ideology, pure if not serene. Here is ideology without words, and all the more powerful in their absence. All that is required to make it stick is a population that devoutly believes in the inevitability of progress...
And the reason is that there has been no worthwhile discussion, let alone widespread public understanding, of what information is and how it gives direction to a culture. There is a certain poignancy in this, since there are no people who more frequently and enthusiastically use such phrases as 'the information age', 'the information explosion', and 'the information society'. We have apparently advanced to the point where we have grasped the idea that a change in the forms, volume, speed and context of information means something, but we have not got any further....To which I might add that questions about psychic, political and social effects of information are as applicable to the computer as to television. Although I believe the computer to be a vastly overrated technology, I mention it here because, clearly, Americans have accorded it their customary mindless inattention; which means they will use it as they are told, without a whimper. Thus, a central thesis of computer technology-that the principal difficulty we have in solving problems stems from insufficient data-will go unexamined. Until, years from now, when it will be noticed that the massive collection and speed-of-light retrieval of data have been of great value to large-scale organizations but have solved very little of importance to most people and have created at least as many problems as they may have solved."
-Neil Postman, Amusing Ourselves to Death, 1985, pages 157-161.

Pollet, T. V., Roberts, S. G., & Dunbar, R. I. (2011). Use of social network sites and instant messaging does not lead to increased offline social network size, or to emotionally closer relationships with offline network members. Cyberpsychology, Behavior, and Social Networking, 14(4), 253-258.

The effect of Internet use on social relationships is still a matter of intense debate. This study examined the relationships between use of social media (instant messaging and social network sites), network size, and emotional closeness in a sample of 117 individuals aged 18 to 63 years old. Time spent using social media was associated with a larger number of online social network "friends." However, time spent using social media was not associated with larger offline networks, or feeling emotionally closer to offline network members. Further, those that used social media, as compared to non-users of social media, did not have larger offline networks, and were not emotionally closer to offline network members. These results highlight the importance of considering potential time and cognitive constraints on offline social networks when examining the impact of social media use on social relationships.

Kuss, D. J., & Griffiths, M. D. (2011). Online social networking and addiction: A review of the psychological literature. International Journal of Environmental Research and Public Health, 8(9), 3528-3552.

Social Networking Sites (SNSs) are virtual communities where users can create individual public profiles, interact with real-life friends, and meet other people based on shared interests. They are seen as a 'global consumer phenomenon' with an exponential rise in usage within the last few years. Anecdotal case study evidence suggests that 'addiction' to social networks on the Internet may be a potential mental health problem for some users. However, the contemporary scientific literature addressing the addictive qualities of social networks on the Internet is scarce. Therefore, this literature review is intended to provide empirical and conceptual insight into the emerging phenomenon of addiction to SNSs by: (1) outlining SNS usage patterns, (2) examining motivations for SNS usage, (3) examining personalities of SNS users, (4) examining negative consequences of SNS usage, (5) exploring potential SNS addiction, and (6) exploring SNS addiction specificity and comorbidity. The findings indicate that SNSs are predominantly used for social purposes, mostly related to the maintenance of established offline networks. Moreover, extraverts appear to use social networking sites for social enhancement, whereas introverts use it for social compensation, each of which appears to be related to greater usage, as does low conscientiousness and high narcissism. Negative correlates of SNS usage include the decrease in real life social community participation and academic achievement, as well as relationship problems, each of which may be indicative of potential addiction.


 





Currently Happening Presently Now

3/23/14

Untitled 4159

 





Currently Happening Presently Now: CONFLICTS OF INTEREST

3/22/14

Untitled 4163
Bekelman, J. E., Li, Y., & Gross, C. P. (2003). Scope and impact of financial conflicts of interest in biomedical research: A systematic review. JAMA, 289(4), 454-465.

Context Despite increasing awareness about the potential impact of financial conflicts of interest on biomedical research, no comprehensive synthesis of the body of evidence relating to financial conflicts of interest has been performed.

Objective To review original, quantitative studies on the extent, impact, and management of financial conflicts of interest in biomedical research.

Data Sources Studies were identified by searching MEDLINE (January 1980-October 2002), the Web of Science citation database, references of articles, letters, commentaries, editorials, and books and by contacting experts.

Study Selection All English-language studies containing original, quantitative data on financial relationships among industry, scientific investigators, and academic institutions were included. A total of 1664 citations were screened, 144 potentially eligible full articles were retrieved, and 37 studies met our inclusion criteria.

Data Extraction One investigator (J.E.B.) extracted data from each of the 37 studies. The main outcomes were the prevalence of specific types of industry relationships, the relation between industry sponsorship and study outcome or investigator behavior, and the process for disclosure, review, and management of financial conflicts of interest.

Data Synthesis Approximately one fourth of investigators have industry affiliations, and roughly two thirds of academic institutions hold equity in start-ups that sponsor research performed at the same institutions. Eight articles, which together evaluated 1140 original studies, assessed the relation between industry sponsorship and outcome in original research. Aggregating the results of these articles showed a statistically significant association between industry sponsorship and pro-industry conclusions (pooled Mantel-Haenszel odds ratio, 3.60; 95% confidence interval, 2.63-4.91). Industry sponsorship was also associated with restrictions on publication and data sharing. The approach to managing financial conflicts varied substantially across academic institutions and peer-reviewed journals.

Conclusions Financial relationships among industry, scientific investigators, and academic institutions are widespread. Conflicts of interest arising from these ties can influence biomedical research in important ways.


"This comprehensive review of the literature confirms that financial relationships among industry, scientific investigators, and academic institutions are pervasive. About one fourth of biomedical investigators at academic institutions receive research funding from industry. One study reported that lead authors in 1 of every 3 articles published hold relevant financial interests, while another reported that approximately two thirds of academic institutions hold equity in 'start-up' businesses that sponsor research performed by their faculty."

"Despite the prevalence of these relationships and the broad concerns they have generated, a relative paucity of data has been published describing the impact of financial ties on biomedical research. Although only 37 articles met inclusion criteria, evidence suggests that the financial ties that intertwine industry, investigators, and academic institutions can influence the research process. Strong and consistent evidence shows that industry-sponsored research tends to draw pro-industry conclusions. By combining data from articles examining 1140 studies, we found that industry-sponsored studies were significantly more likely to reach conclusions that were favorable to the sponsor than were nonindustry studies."


Perlis, R. H., Perlis, C. S., Wu, Y., Hwang, C., Joseph, M., & Nierenberg, A. A. (2005). Industry sponsorship and financial conflict of interest in the reporting of clinical trials in psychiatry. American Journal of Psychiatry, 162(10), 1957-1960.

OBJECTIVE: Financial conflict of interest has been reported to be prevalent in clinical trials in general medicine and associated with a greater likelihood of reporting results favorable to the intervention being studied. The extent and implications of industry sponsorship and financial conflict of interest in psychiatric clinical trials have not been investigated, to the authors’ knowledge. METHOD: The authors examined funding source and author financial conflict of interest in all clinical trials published in the American Journal of Psychiatry, the Archives of General Psychiatry, the Journal of Clinical Psychopharmacology, and the Journal of Clinical Psychiatry between 2001 and 2003. RESULTS: Among 397 clinical trials identified, 239 (60%) reported receiving funding from a pharmaceutical company or other interested party, and 187 studies (47%) included at least one author with a reported financial conflict of interest. Among the 162 randomized, double-blind, placebo-controlled studies examined, those that reported conflict of interest were 4.9 times more likely to report positive results; this association was significant only among the subset of pharmaceutical industry-funded studies. CONCLUSIONS: Author conflict of interest appears to be prevalent among psychiatric clinical trials and to be associated with a greater likelihood of reporting a drug to be superior to placebo.
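The "4.9 times more likely" figure is an odds ratio from a single 2x2 table. Here is a minimal sketch of how such a ratio and its 95% confidence interval are computed; the counts below are invented to land near 4.9 and are not the paper's data:

import math

# Hypothetical 2x2 table (NOT the paper's data):
#                 positive result   negative result
# conflict           a = 70            b = 20
# no conflict        c = 30            d = 42
a, b, c, d = 70, 20, 30, 42

odds_ratio = (a * d) / (b * c)  # cross-product ratio

# 95% CI via the standard error of the log odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
low = math.exp(math.log(odds_ratio) - 1.96 * se)
high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.1f}, 95% CI {low:.1f}-{high:.1f}")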

Glode, E. R. (2002). Advising Under the Influence: Conflicts of Interest Among FDA Advisory Committee Members. Food and Drug Law Journal, 57, 293.

Harris, G. (2009, December 18). Advisers on vaccines often have conflicts, report says. The New York Times, New York edition, p. A28.

Representative Rosa DeLauro, a Connecticut Democrat who said she had long been a supporter of the C.D.C., said: “That is why I am so concerned about this report issued by the inspector general exposing serious ethics violations within the C.D.C. All members of the federal advisory committees, whose recommendations direct federal policy, should be without conflict of interest.”

Levinson, D. R. (2009, December). CDC's Ethics Program for Special Government Employees on Federal Advisory Committees (OIG Final Report OEI-04-07-00260). Department of Health & Human Services, Office of Inspector General.

DeLong, G. (2012). Conflicts of interest in vaccine safety research. Accountability in Research, 19(2), 65-88.

Conflicts of interest (COIs) cloud vaccine safety research. Sponsors of research have competing interests that may impede the objective study of vaccine side effects. Vaccine manufacturers, health officials, and medical journals may have financial and bureaucratic reasons for not wanting to acknowledge the risks of vaccines. Conversely, some advocacy groups may have legislative and financial reasons to sponsor research that finds risks in vaccines. Using the vaccine-autism debate as an illustration, this article details the conflicts of interest each of these groups faces, outlines the current state of vaccine safety research, and suggests remedies to address COIs. Minimizing COIs in vaccine safety research could reduce research bias and restore greater trust in the vaccine program.

Sterne, J. A., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. BMJ, 323(7304), 101-105.

Studies that show a significant effect of treatment are more likely to be published, be published in English, be cited by other authors, and produce multiple publications than other studies. Such studies are therefore also more likely to be identified and included in systematic reviews, which may introduce bias. Low methodological quality of studies included in a systematic review is another important source of bias.

All these biases are more likely to affect small studies than large ones. The smaller a study, the larger the treatment effect necessary for the results to be significant. The greater investment of time and money in larger studies means that they are more likely to be of high methodological quality and published even if their results are negative. Bias in a systematic review may therefore become evident through an association between the size of the treatment effect and study size; such associations may be examined both graphically and statistically.
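The "graphical and statistical" examination the authors mention typically means a funnel plot plus a regression test such as Egger's, which checks whether smaller (less precise) studies systematically report larger effects. A minimal sketch, assuming made-up effect sizes and standard errors rather than data from the paper:

import numpy as np

# Hypothetical per-study effects (log odds ratios) and standard errors;
# note the less precise studies here happen to show larger effects.
effects = np.array([0.80, 0.60, 0.90, 0.30, 0.25, 0.20])
ses     = np.array([0.40, 0.35, 0.30, 0.15, 0.10, 0.08])

# Egger's test: regress the standardized effect on precision.
# An intercept far from zero suggests funnel-plot asymmetry
# (small-study effects, one possible sign of publication bias).
z = effects / ses
precision = 1.0 / ses
X = np.column_stack([np.ones_like(precision), precision])
coef, *_ = np.linalg.lstsq(X, z, rcond=None)
intercept, slope = coef
print(f"Egger intercept: {intercept:.2f} (far from 0 suggests asymmetry)")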


Bastian, H. (2006). ‘They would say that, wouldn't they?’ A reader's guide to author and sponsor biases in clinical research. Journal of the Royal Society of Medicine, 99(12), 611-614.

Lee, K., Bacchetti, P., & Sim, I. (2008). Publication of clinical trials supporting successful new drug applications: a literature analysis. PLoS Medicine, 5(9), e191.

Over half of all supporting trials for FDA-approved drugs remained unpublished five or more years after approval. Pivotal trials and trials with statistically significant results and larger sample sizes are more likely to be published. Selective reporting of trial results exists for commonly marketed drugs. Our data provide a baseline for evaluating publication bias as the new FDA Amendments Act comes into force, mandating basic results reporting of clinical trials.

Wager, E. (2007). Authors, ghosts, damned lies, and statisticians. PLoS Medicine, 4(1).

Since the earliest peer-reviewed publications of the late 17th century, conventions about the authorship of scientific papers—which were generally anonymous and attributed to the sponsor (in those days, usually the church or the king)—have evolved considerably. Readers now want to know not only who paid for the research but also who did the work. Transparency (i.e., full disclosure) is now considered a moral responsibility, and many medical journals have introduced mechanisms for increasing transparency. The International Committee of Medical Journal Editors (ICMJE) has also issued guidance on who qualifies for authorship, and their criteria have been updated and augmented several times in response to several authorship scandals. Yet problems with authorship persist.

Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.

There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.
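Ioannidis's argument rests on a simple formula for the post-study probability that a claimed finding is true (the positive predictive value, PPV), driven by the pre-study odds R, the significance threshold alpha, the type II error rate beta, and a bias term u. A sketch of the paper's formula in Python; the two scenarios at the end are illustrative parameter choices, not cases from the paper:

def ppv(R, alpha=0.05, beta=0.2, u=0.0):
    # Post-study probability a claimed finding is true (Ioannidis, 2005).
    # R: pre-study odds that a probed relationship is true
    # alpha: type I error rate; beta: type II error rate (power = 1 - beta)
    # u: proportion of analyses that are biased
    numerator = (1 - beta) * R + u * beta * R
    denominator = R + alpha - beta * R + u - u * alpha + u * beta * R
    return numerator / denominator

# Well-powered test of a plausible hypothesis: PPV ~ 0.89
print(ppv(R=0.5, alpha=0.05, beta=0.2))
# Underpowered, biased, exploratory search: PPV ~ 0.07
print(ppv(R=0.05, alpha=0.05, beta=0.7, u=0.3))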

Taubes, G., & Mann, C. C. (1995). Epidemiology faces its limits. Science, 269(5221), 164-169.

Topol, E. J. (2004). Failing the public health—rofecoxib, Merck, and the FDA. New England Journal of Medicine, 351(17), 1707-1709.

Unfortunately, such a trial was never done. The FDA has the authority to mandate that a trial be conducted, but it never took the initiative. Instead of conducting such a trial at any point — and especially after the FDA advisory committee meeting in 2001 — Merck issued a relentless series of publications, beginning with a press release on May 22, 2001, entitled “Merck Reconfirms Favorable Cardiovascular Safety of Vioxx” and complemented by numerous papers in peer-reviewed medical literature by Merck employees and their consultants. The company sponsored countless continuing medical “education” symposiums at national meetings in an effort to debunk the concern about adverse cardiovascular effects.

Each time a study was presented or published, there was a predictable and repetitive response from Merck, which claimed that the study was flawed and that only randomized, controlled trials were suitable for determining whether there was any risk. But if Merck would not initiate an appropriate trial and the FDA did not ask them to do so, how would the truth ever be known?

Horrobin, D. F. (2001). Something rotten at the core of science? Trends in Pharmacological Sciences, 22(2), 51-52.

The US Supreme Court has recently been wrestling with the issues of the acceptability and reliability of scientific evidence. In its judgement in the case of Daubert versus Merrell Dow, the Court attempted to set guidelines for US judges to follow when listening to scientific experts. Whether or not findings had been published in a peer-reviewed journal provided one important criterion. But in a key caveat, the Court emphasized that peer review might sometimes be flawed and therefore this criterion was not unequivocal evidence of validity or otherwise. A recent analysis of peer review adds to this controversy by identifying an alarming lack of correlation between reviewers' recommendations.




 





Currently Happening Presently Now: ELECTROMAGNETIC FIELDS

3/21/14

Untitled 4155
Firstenberg, A. (1997). Microwaving Our Planet: The Environmental Impact of the Wireless Revolution. Cellular Phone Taskforce.

"From Bill Gates' planned fleet of 300 satellites to the millions of ground based antennas being constructed through-out the world, our privacy is being invaded, our health undermined, our water polluted, endangered species threatened, the ozone layer destroyed, and our climate altered. The assault has already begun.

The purpose of this report is to give a general overview of the environmental threats associated with the wireless revolution, and an in-depth review of 70 years of research into the health hazards of microwaves.

The lack of an adequate review of the literature until now has led to the incorrect perception that the scientific evidence is contradictory and inconclusive. In fact the scientific evidence is consistent and overwhelming."

"Until recently almost all radio transmitters have been fixed and their range limited. The addition of more broadcast channels and new types of communication devices did not change that. But with the advent of cellular technology, all limits have been lifted. Telephones are no longer just communicators but also transmitters, and they are mobile. Suddenly every human being is a potential source of radiation. Suddenly electronic communication is a human right. Suddenly fixed transmitters and satellites are being built to accommodate mobile human beings, rather than the other way around.

Electromagnetic pollution will no longer remain concentrated in population centers, nor will radio transmitters be confined any longer to non-residential zones. In the space of a year or two, unless the people put a stop to it, this form of pollution will be spread more or less evenly over every square inch of the world."

"We cannot expect to increase the irradiation of the entire earth 1000-fold or more virtually overnight without health effects and without massive biological consequences. Indeed this technology is more invasive than virtually any other and has the potential of causing worldwide catastrophe."

"Electromagnetic Fields (EMF): Killing Fields" by A. Firstenberg, The Ecologist v.34, n.5, 1 jun 2004.

Sage, C., & Carpenter, D. O. (2009). Public health implications of wireless technologies. Pathophysiology, 16(2), 233-246.

Global exposures to emerging wireless technologies from applications including mobile phones, cordless phones, DECT phones, WI-FI, WLAN, WiMAX, wireless internet, baby monitors, and others may present serious public health consequences. Evidence supporting a public health risk is documented in the BioInitiative Report. New, biologically based public exposure standards for chronic exposure to low-intensity exposures are warranted. Existing safety standards are obsolete because they are based solely on thermal effects from acute exposures. The rapidly expanding development of new wireless technologies and the long latency for the development of such serious diseases as brain cancers means that failure to take immediate action to reduce risks may result in an epidemic of potentially fatal diseases in the future. Regardless of whether or not the associations are causal, the strengths of the associations are sufficiently strong that in the opinion of the authors, taking action to reduce exposures is imperative, especially for the fetus and children. Such action is fully compatible with the precautionary principle, as enunciated by the Rio Declaration, the European Constitution Principle on Health (Section 3.1) and the European Union Treaties Article 174.


"Human beings are bioelectrical systems. Our hearts and brains are regulated by internal bioelectrical signals. Environmental exposures to artificial EMFs can interact with fundamental biological processes in the human body. In some cases, this may cause discomfort, or sleep disruption, or loss of wellbeing (impaired mental functioning and impaired metabolism) or sometimes, maybe it is a dread disease like cancer or Alzheimer’s disease. It may be interfering with ones’ ability to become pregnant, or carry a child to full term, or result in brain development changes that are bad for the child. It may be these exposures play a role in causing long-term impairments to normal growth and development of children, tipping the scales away from becoming productive adults. We have good evidence these exposures can damage our health, or that of children of the future who will be born to parents now immersed in wireless exposures.

In the United States, the deployment of wireless infrastructure (cell tower sites) to support cell phone use has accelerated greatly in the last decades. The spread of cell towers in communities, often placed on pre-school, church day-care, and school campuses, means that young children can have thousands of times higher RF exposures in home and school environments than existed even 20-25 years ago. CTIA estimates that in 1997 there were only 36,650 cell sites in the US; this number increased rapidly to 131,350 by June 2002, 210,350 by June 2007, and 265,561 by June 2012 (CTIA, 2012); about 220,500 cell sites existed in 2008. These wireless antennas for cellular phone voice and data transmission produce whole-body RFR exposures over broad areas in communities that are an involuntary and unavoidable source of radiofrequency radiation exposure. Further, the nearly universal switch to cordless and cell phones, and away from corded landline phones, means close and repetitive exposures to both EMF and RFR in the home. Other new RFR exposures that didn’t exist before come from WI-FI access points (hotspots) that radiate 24/7 in cafes, stores, libraries, classrooms, on buses and trains, and from personal WI-FI enabled devices (iPads, tablets, PDAs, etc.).

The largest single source of community-wide, pervasive RFR yet rolled out is the ‘smart meter’ infrastructure. This program places a wireless device (like a mini-mobile phone base station) on the wall, replacing the electromechanical (spinning dial) meter. They are to be installed on every home and classroom (every building with an electric meter). Utilities from California to Maine have installed tens of millions already, despite health concerns of experts and enormous public resistance. The wireless meters produce spikes of pulsed radiofrequency radiation 24/7, and in typical operation will saturate living space at levels that can be much higher than those already reported to cause bioeffects and adverse health effects (utilities can only say they are compliant with outdated federal safety standards, which may or may not always be true; see http://sagereports.com/smart-meter-rf). These meters, depending on where they are placed relative to occupied space in the home or classroom, can produce RFR exposure levels similar to those within the first 100 feet to 600 feet of a mobile phone base station (cell tower).

The cumulative RFR burden within any community is largely unknown. Both involuntary sources (like cell towers, smart meters, and second-hand radiation from the use of wireless devices by others) plus voluntary exposures from one's personal use of cell and cordless phones, wireless routers, electronic baby surveillance monitors, wireless security systems, wireless hearing aids, and wireless medical devices like implanted insulin pumps all add up. No one is tallying up the combined exposure levels. Billions of new RFR transmitters from the smart meter rollout alone will raise the baseline RFR levels, and will significantly add to the existing RFR background.

Sometimes, science does not keep pace with new environmental exposures that are by-products of useful things we want to buy and use in society. So, the deployment runs ahead of knowledge of health risks. It is an old story. This is the case for EMF and RFR, and this Report underscores the critical need to face difficult questions, make mid-course corrections, and try to repair the damage already done in this generation, and to think about protecting future generations."
- "Why we care": The BioInitiative Report 2012: A Rationale for Biologically-based Public Exposure Standards for Electromagnetic Fields (ELF and RF)
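To put the Report's 100-to-600-feet base-station comparison in rough physical terms: in free space, power density falls off with the square of distance, S = EIRP / (4 * pi * r^2). A back-of-the-envelope sketch; the 800 W EIRP figure is a hypothetical sector power chosen for illustration, not a number from the Report:

import math

def power_density(eirp_watts, distance_m):
    # Free-space power density S = EIRP / (4*pi*r^2), in W/m^2.
    # Ignores antenna downtilt, reflections, and building attenuation,
    # so this is only a crude upper-bound estimate.
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# Hypothetical single cell-tower sector radiating 800 W EIRP
for r in (30, 180):  # roughly 100 ft and 600 ft
    s = power_density(800, r)
    print(f"{r:4d} m: {s * 100:.2f} uW/cm^2")  # 1 W/m^2 = 100 uW/cm^2

The inverse-square falloff is why distance from the source, and placement of a transmitter relative to occupied space, dominate any such exposure estimate.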

Bellieni, C. V., & Pinto, I. (2012). Fetal and Neonatal Effects of EMF.

The exposure of the developing fetus and of children to electromagnetic fields (EMF), including both radiofrequency radiation (RF) used in new wireless technologies and extremely low frequency or power frequency fields (ELF-EMF), has raised public health concerns because of the possible effects (cancer, neurological effects, developmental disability effects, etc.) from long-term exposure to low-intensity, environmental-level fields in daily life. This chapter documents some studies on RF and ELF-EMF that report bioeffects and adverse health impacts to the fetus and young child where exposure levels are still well within the current legal limits of many nations.

Several studies report adverse health effects at levels below safety standards; the evidence to date suggests that special attention should be devoted to the protection of embryos, fetuses, and newborns, who can be exposed to many diverse frequencies and intensities of EMF throughout their lifetimes and whose health and wellness consequences are still scarcely explored.



 


