I believe that as scientists we have to be able to admit that we are capable of making mistakes. If we "hit a wall" in our progress, we must feel free to say so, or to ask for help from other scientists in the community when it is needed. Beginning at this point is important, and by continuing to seek the review of other scientists in our fields we will begin to minimize these mistakes. Earlier in the semester we read about Alfred Binet and his initial research into craniometry in Stephen Jay Gould's The Mismeasure of Man. When Binet suspected that he was falling victim to confirmation bias, he had his results double-checked by his assistant and found that he was indeed confirming his own a priori beliefs. This scientist knew almost a century ago what we are still struggling to accept: if we cannot be intrinsically honest, then we must have an external system of checks.
Even with an elaborate system of checks, errors can be made which could result in the arresting of scientific progress or, worse, the loss of human life. We must insist that dollars stop running the minds of scientists and that sense become the prevailing ideal. With the ability to say, "I am wrong, I was wrong, this is wrong," we give our scientists the same rights we would give to anyone else, and yet we also hold them to say, "I will keep working to figure it out!" Professor Oyen made an excellent point yesterday at the end of class when she said that even the data entry assistant in the research laboratory is curing cancer. Sometimes it is proximity to the change that influences the actions of the agents. Ideas like these turn the facets of numbers into the faces of patients, and if we could implement a system like this, we might see a positive change in medical research.
Finally, we must as a society be able to rein in our excitement over an advance and fully explore all of the implications of its administration. Without this full review of scientific advancement, we as a society will continue to see mistakes taken too far and lives lost which otherwise could have been saved. Coupling this kind of system with an increased degree of transparency in scientific reporting, we could see a reduction in these Type I errors, which, although not prevalent, occur at an alarming rate. This transparency could also translate into a better system of public education about research techniques and the implications of committing an ecological fallacy (e.g., Wakefield, Potti, McNamara, Matsubara).
I completely agree that transparency is especially important in science (in every field)! Yet research communities are not pushed to be transparent because of the various incentives at play, and businesses and corporations are not transparent because of profit. It is interesting how we will be the first ones to say "we want more transparency," when we are also the first ones to sue or become viciously angry if there is any type of mistake. We tell ourselves "No, there is no room for error," when there HAS to be room for error. We are not robots, programmed never to become trapped in mistakes; we are also not robots, always doing the right things for the right reasons. That is why it is always better to know more about a study or analysis than to not know enough. So we definitely need to set higher goals for research communities, as well as for businesses, to become transparent. It is the only way for mistakes not to be shunned, but to become part of the solution.