 Discussion - Scientific progression and morals 
Joined: Wed Jan 07, 2009 10:26 am
Posts: 4074
Location: That quaint little British colony down south
Culthero wrote:
When I gave that arbitrary 20 I was trying to suggest that I'm not contributing to the suffering of generations of animals in labs, and we couldn't possibly surpass animal testers in agonising those animals.
You evidently are not aware of the incredibly specific and strict sets of guidelines involved in animal testing which you would have to follow to get any appreciable funding.
Culthero wrote:
I gave those fictional examples because the line between science fiction and actual discoveries is usually a blurry and short one. Something you watched ten years ago in a movie might as well have been produced today, which means that you have a lot to fear.
Again, you say that because something was in a movie once, it is possible and therefore something to fear. There is no basis in evidence for this point of view apart from arbitrary fear.
Culthero wrote:
Because as long as science and scientific discoveries offer less than they actually solve and reach fewer people than they should actually reach, they do more harm than good.
Scientific discoveries offer nothing; applications offer things. You don't seem to understand the way in which scientific advance interacts with actual application.
Culthero wrote:
You said that research for nukes and nuclear plants go hand in hand. Now that we have another eco-disaster like Chernobyl at hand in Japan, has it ever occurred to you that neither line of research should have been conducted in the first place?
Chernobyl was in Ukraine (then part of the Soviet Union), and came as a result of extremely poor engineering and operating practices. The events in Japan - that would be the bombings of Hiroshima and Nagasaki - came as a result of militaristic applications of nuclear power.
Culthero wrote:
Harnessing the power of the sun here on earth would of course have a catch, wouldn't it? Scientists have been able to create energy, yes. But were they also able to keep it clean? If the consequences, unforeseen or not, equal ruined lives and a ruined environment, is the price of progress worth it?
'Harnessing the power of the sun' has as much of a catch as harnessing the power of a heavy bit at the end of a stick. Its ability to maintain peace by preventing conflict over resources is much greater than its ability to produce damage in wars. The same goes for nuclear power. The end-effects of nuclear science have saved many more lives than they have destroyed.
Culthero wrote:
As for the concept of regulation: let's say that scientists are kept under watch by other scientists. Say there is a global scientific research ethics council which consists of elder scientists who do not actively participate in research, who do not do business with any government, and who accept or refuse research proposals. The council would be an entity above all governments and would approve the projects they deem beneficial to humankind. No politicians, military, or corporations/profiteers allowed in the decision to carry out the research or not. I would've trusted science then. But we are a long way from even that happening. Unless science guides us to more environmentally friendly ways of living, we or the ecosystem pays for our comforts.
A large amount of modern science is directed solely at making older methods sustainable and ecologically friendly. You haven't given a well-defined reason why this council would be better than the current system.
Culthero wrote:
A side note is that we should be able to "feel" some things in our own way. In our era, data and numbers can always be falsified, outdated, superseded, made obsolete, discarded, or deemed right or wrong by opposing parties. We have a constant flow of information but no way to filter it or navigate through it aside from how we feel about it. However, that will not cut it for the scientists, because we are not the ones who can go to work and create the next viral strain or discover a new way to affect and control the masses. Scientists should be able to filter the information they have obtained in their own light and feeling; not for money, power, or glory, but for the people of their Earth.
You seem to think this isn't predominantly the case. You also seem to think that scientists sit away in shadowy labs making mind-control devices and bio-weapons. You don't seem to have actually made a point in that last sentence.
Culthero wrote:
I hope my examples and analogies do not overshadow the main point I am trying to make. Basically, good science in the hands of bad people has a lot more cons than pros - and we usually cannot enjoy the pros anyway.
You are defining scientific knowledge as 'good' and 'bad'. There are a number of reasons this is a bad idea. Firstly, scientific knowledge is neutral; its application may be beneficial to society or detrimental to it. Applications are, generally speaking, not wholly beneficial or detrimental. Benefit and detriment are basically never applicable to all of society.
Culthero wrote:
- Progress for making nukes = bad
- Progress for building nuclear reactors = bad
Did you know that the increase in environmental radiation as a result of a coal-fired power plant is several times higher than that of a fission power plant? Did you know that a fusion power plant produces even less radiation and no radioactive waste products? Did you know that the research that would allow us to produce incredible amounts of clean energy from specific isotopes found in water is the same research that gives us the fusion bomb and the fission bomb? Did you know that in a theoretical scenario wherein huge amounts of research go into both, one of the predominant causes of weapon deployment (resource disputes) would be lessened to an incredible degree?
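(For context, the "isotopes found in water" are chiefly deuterium - heavy hydrogen, which occurs naturally in seawater - plus tritium bred from lithium. The basic reaction pursued for fusion power is

\[ {}^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He} + \mathrm{n} + 17.6\ \mathrm{MeV} \]

the same physics that, uncontained, drives the hydrogen bomb.)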
Culthero wrote:
- Progress for more Toyotas and hamburgers = bad
- Progress for the disabled = good
These two points are the same thing. The engineering that makes it possible for the disabled to lead normal lives enables the production of cars and a large amount of other machinery. It also allows human life to be viable outside of very specific population centres, often via the availability of food.
Culthero wrote:
- Progress for extinct species = good
Any reason why? Extinct-species reintroduction would likely affect modern ecosystems in ways which are detrimental to both current species and humanity.
Culthero wrote:
- Progress for better drones = bad
Drones can be used in war, true. War drones generally don't need to be much more than something that can fly, see and use missiles. We already have that, so much of modern drone research goes into disaster aid: earthquake survivor assistance, building collapse management, flood management, and so on.
Culthero wrote:
- Progress for brainwashing = bad
The same progress that allows this allows for people with serious mental illness to live normal, fulfilling lives.
Culthero wrote:
- Progress for money/power/glory = bad
Motive is more important than end-effect? If the man who cured cancer did so because he wanted recognition, would that, in your opinion, be a wholly bad thing?
Culthero wrote:
- Progress for the people/Earth = good
You have quite clearly shown that your vision of progress for people/earth happens to leave out large swathes of people and earth.
Culthero wrote:
If science can not be released from the clutches of corporations, military, or the politicians = bad.
The clutches? You seem to have a view of scientific research derived from science fiction and action films.


Mon Mar 14, 2011 12:21 pm
Joined: Thu Mar 04, 2010 9:07 pm
Posts: 126
Location: Turkey
Well I might be wrong in using "science/scientific" instead of "their applications" and a lot of my views might be derived from science fiction. But a lot of my thoughts are also affected by tragedy and drama. I am saying that Dionysus is as godly as Apollo, Prometheus as divine as Zeus (you may look them up and find that they are significant symbols on this subject). I am saying that seeing/hearing does not equal understanding. I'm saying that motives are, of course, more important than the ends. I think they are the single most important thing in making progress. If we are so smart as to make science as opposed to being territorial animals, who after all those thousands of years are as yet in pursuit of resources, then I am afraid to say that we have failed to fulfill the true purpose of making progress. For the purpose of science should not be the provision of the excesses of mankind but giving it what it lacks to survive. But in the light of what we are doing to the world I will have to agree that dire times require dire measures, and so in order to live in the "technologically advanced" world that we have created, we will need more unbound and liberal thinking, even if it means some sacrifices will have to be made.

I am not saying that there should be no scientific progression whatsoever. I'm just saying that it should be watched by people who do the same job of making our lives easier and better by inventing new things and not by those who fund it in order to get more profit, weapons, or power. Funds and other bonds between inventors and the possible benefactor usually affect the possible favorable applications of the invention in an adverse way, as we have seen with the bombs, A.I. research into unmanned vehicles, bioweapons, etc. Gifting technologies with weapon applications to a few people is a rather stupid idea. The arms race is not good for our survival if it means making weapons that are capable of eradicating another race, or mutual destruction. But an evolutionary arms race, wherein the sides better themselves and protect their environment so that further competition and rivalry push them toward a higher stage of mutual existence, can be a better solution.

When I say "no drones" I mean bomb releasing ones. When I say "extinct animals" I mean the ones that are or will be extinct because of us. I am sorry for the time you spent pinning down my sentences and pouring solvent on them if I was misunderstood the first time.


Mon Mar 14, 2011 1:53 pm
happy carebear mom
Joined: Tue Mar 04, 2008 1:40 am
Posts: 7096
Location: b8bbd5
Culthero wrote:
Well I might be wrong in using "science/scientific" instead of "their applications" and a lot of my views might be derived from science fiction. But a lot of my thoughts are also affected by tragedy and drama.

Science fiction is just that, fiction. There are far more benefits to society to be found along the path of knowledge than there are on the path of fear of knowledge.


Mon Mar 14, 2011 2:07 pm
Joined: Wed Jan 07, 2009 10:26 am
Posts: 4074
Location: That quaint little British colony down south
Culthero wrote:
a lot of my thoughts are also affected by tragedy and drama
So are most people's. I don't see how that's relevant.
Culthero wrote:
I am saying that Dionysus is as godly as Apollo, Prometheus as divine as Zeus
Depending on interpretation, Prometheus may have been considered more divine, being a generation up from Zeus. Using them solely as symbols for scientific progress seems to be disrespecting the wide spectrum of things each represented, but whatever; you haven't actually decided to define what you believe them to represent, so it's a non-issue.
Culthero wrote:
I am saying that seeing/hearing does not equal understanding.
Well, it's a good thing science is based around interpretation rather than mere observation.
Culthero wrote:
I'm saying that motives are, of course, more important than the ends. I think they are the single most important thing in making progress.
But that's where the issue arises. Why is the motive more important than the result if you have decided elsewhere that the pure results are what matters? It strikes me as an inconsistency in your argument.
Culthero wrote:
If we are so smart as to make science as opposed to being territorial animals, who after all those thousands of years are as yet in pursuit of resources then I am afraid to say that we have failed to fulfill the true purpose of making progress. For the purpose of science should not be the provision of the excesses of mankind but giving it what it lacks to survive.
Excesses of mankind? People still starve. Researching the structure of the atom gives us nuclear bombs, but also ways to cheapen energy. Researching the mechanics of an engine gives us the planes that drop the bombs, but also the means to transport the food that people need. If the purpose of progress is merely to keep mankind subsisting, then we should, in your opinion, have abolished it long ago. People can easily remain on this planet. By your metric we should have stopped science in the Dark Ages. Medicine, mental health care and the so-called 'excesses' which let our infirm, disabled and elderly lead comfortable lives are unnecessary in your vision of science.
Culthero wrote:
But in the light of what we are doing to the world I will have to agree that dire times require dire measures, and so in order to live in the "technologically advanced" world that we have created, we will need more unbound and liberal thinking, even if it means some sacrifices will have to be made.
How have we created the world? People arose naturally from the world and no external force urged them to produce the modern world. At what point do you decide the world has been created by man? Anything we do is natural by virtue of the fact that we ourselves are natural.
Culthero wrote:
I am not saying that there should be no scientific progression whatsoever.
This, again, strikes me as inconsistent with the rest of your argument.
Culthero wrote:
I'm just saying that it should be watched by people who do the same job of making our lives easier and better by inventing new things and not by those who fund it in order to get more profit, weapons, or power.
You seem to be thinking of development, not research. Research uncovers the knowledge which development applies. Even without organisations whose interests specifically lie within such progress, the principles remain to be exploited. If you stop any research that could result in 'bad' applications, however arbitrary that designation is, you will end up cutting off almost all the progress you currently have access to.
Culthero wrote:
Funds and other bonds between inventors and the possible benefactor usually affect the possible favorable applications of the invention in an adverse way, as we have seen with the bombs, A.I. research into unmanned vehicles, bioweapons, etc.
You don't seem to have very much knowledge about how any of the examples you listed came to be.
Culthero wrote:
Gifting technologies with weapon applications to a few people is a rather stupid idea. The arms race is not good for our survival if it means making weapons that are capable of eradicating another race, or mutual destruction.
There is a good reason the nuclear bomb has yet to be used extensively and most countries are trying to cut down their stockpiles. It is an inefficient weapon. Life expectancy in Hiroshima and Nagasaki, where the bombs were dropped, is today not appreciably lower than in the rest of Japan. The nuclear bomb is a monument to its owner's power, but if you can make 100 smaller bombs for half the price that will hit every target you would have wanted to hit with the nuclear weapon, then you will use the smaller bombs. Modern military powers are working towards dynamic and fluid forces.
Culthero wrote:
But an evolutionary arms race, wherein the sides better themselves and protect their environment so that further competition and rivalry push them toward a higher stage of mutual existence, can be a better solution.
How is this a better solution? If China revealed cold fusion reactors tomorrow and decided to keep them for their economic advantage, I can say with high certainty that the resulting wars would be much bloodier and drawn out than if they revealed a stockpile of nuclear weapons as large as America's.
Culthero wrote:
When I say "no drones" I mean bomb releasing ones. When I say "extinct animals" I mean the ones that are or will be extinct because of us.
I know you mean bomb releasing drones. The problem is that you ignore that when drones were developed, they were developed as just drones. Some were developed into things that take life, some were developed into things that make life easier, and the rest were and are being developed into things that save lives. I am completely certain that the number of lives saved by drones will be far greater than the number of lives lost. I meant extinct animals too. Again, you are making a special distinction between animals that became extinct of 'natural' causes and those that became extinct because of humans. Humans are as much a natural evolutionary pressure as seasonal change, if a fairly harsh one. A large number of extinct animals, if brought back, would be doomed to suffer an existence barely sustained and almost entirely reliant on humans.
Culthero wrote:
I am sorry for the time you spent pinning down my sentences and pouring solvent on them if I was misunderstood the first time.
I am sorry for the large amount of misunderstanding you seem to hold in relation to science and technology.


Mon Mar 14, 2011 2:46 pm
Joined: Thu Mar 04, 2010 9:07 pm
Posts: 126
Location: Turkey
I have to go to work now.
http://www.youtube.com/watch?v=9W5Am-a_xWw
But I will be back.
http://www.youtube.com/watch?v=AsEqBQAcYVw


Mon Mar 14, 2011 4:47 pm
happy carebear mom
Joined: Tue Mar 04, 2008 1:40 am
Posts: 7096
Location: b8bbd5
Speaking of sci-fi, I would recommend you read the Ender's Game series for a more even-handed treatment of the topics of AI, bioengineering, and technology as a whole. Throughout the storyline, technology is used as a tool and nothing more; some characters use it for good and some for evil. The government uses it both to humanity's detriment and for its continued survival, and good intentions are shown to be just as susceptible to corruption as your local politician.


Mon Mar 14, 2011 6:17 pm
DRL Developer
Joined: Fri May 15, 2009 10:29 am
Posts: 4107
Location: Russia
Yeah, except humans didn't make the monoliths.
Evil AI overlord decides to eradicate the human resistance, sends robots back in time, the robots get salvaged, and that creates the evil AI overlord.

Both of those scenarios contain extremely unusual factors, e.g. a huge alien monolith, or a time-travelling robot that gets salvaged to create an AI for taking care of the US military.


Mon Mar 14, 2011 8:23 pm
DRLGrump
Joined: Tue Nov 07, 2006 1:26 am
Posts: 2037
Location: Jerking off in a corner over by the OT sub-forum
God I love watching Allstone.


Mon Mar 14, 2011 8:59 pm
Data Realms Elite
Joined: Sun Nov 01, 2009 3:00 pm
Posts: 4144
Location: Hell.
Tomaster wrote:
God I love watching Allstone.

I'm inclined to agree with you. He is just so chill.


Mon Mar 14, 2011 9:55 pm
Data Realms Elite
Joined: Tue May 25, 2010 8:27 pm
Posts: 4521
Location: Constant motion
I was thinking exactly the same thing.


Mon Mar 14, 2011 10:15 pm
Joined: Wed Jan 07, 2009 10:26 am
Posts: 4074
Location: That quaint little British colony down south
Culthero wrote:
<HAL committing what equates to murder and attempted murder>
The story of 2001: A Space Odyssey is based around the idea that HAL's superiors programmed him poorly, without thinking to do a desk-check or a run-through, or to think out what they were doing to any great degree. Had they asked HAL whether doing this was a good idea, it would have replied strongly in the negative.
Culthero wrote:
<A robot harming multiple humans>
Here, the Terminator avoids lethal or seriously incapacitating shots in order to complete a mission that intends to save a majority of human life world-wide. If they instead sent a squad of humans back in time, there would have been significant deaths on both sides instead of injuries. The greater technological antagonist is formed as a result of a huge system coming online with large amounts of power without any significant testing.

In both cases, modern guidelines and standards on testing and quality assurance would have completely prevented any significant occurrence.

Nonsequitorian: The re-birthing of extinct life-forms offers a huge amount of potential benefits, but would require rather strict security measures. A particular example would be the wide variety of plants which we have yet to chemically analyse and which may go extinct before we have the opportunity to do so, or already have. The number of medicines we derive from plants is astounding, and greater opportunity to analyse them would result in a similarly increased number of medicines available to combat illnesses.


Tue Mar 15, 2011 7:57 am
Joined: Thu Mar 04, 2010 9:07 pm
Posts: 126
Location: Turkey
I have read your criticisms. The extinction argument looks legit, as we can't hope to turn back the gears of time and have already caused irreversible damage to the earth and its other inhabitants. But it still seems to me that our science solves things with overly complex inventions at times and wrong applications at other times.

Still, you can't prevent a nuclear reactor from going critical for a meltdown or blasting after a tsunami or an earthquake, and you have to evacuate lots of people, and a lot of others may still die because of the chance of wider contamination. These can, of course, in turn be prevented by taking further measures. But, in application or in reality, governments and corporations don't usually take all the necessary measures recommended by men of science; the lack of failsafes or reduced safety at application is a big problem. And people are left alone with the dragon someone else has created. Wouldn't it be better if we had used less risky methods such as tidal or geothermal power stations? "Modern guidelines would have been more careful" does not really cut it as long as the motive of either the scientist or the applier/seller is money, power, or glory. That's why we are discussing progress and morals. Good motives or intentions will not always cut it either, but in their absence you are sure to have more manmade disasters and more tragedy. How is creating an A.I. that can possibly outsmart you safe for humanity? I agree with the fact that an A.I. which surpassed our own mental capabilities would be a literally monumental discovery in the sense that we would be leaving our child behind in the case of total annihilation. However, I wouldn't be so sure about it nursing us in the senile years of our species, waiting for the time of our final procession rather than replacing us immediately after gaining the power to succeed us.

And tell me if there would be any scientific progress at all had it not been for the fiction writer throwing a stone into the bottomless well for the scientist to take out? Scientists invent/discover. We apply it. It makes an impact on our lives and the writer's life just the same. Then he/she writes about "what would be if..." and after that the cycle begins anew with the scientist making a fresh discovery after reading "what would be if..." (The "discovery-application-impact" cycle is from Future Shock by Toffler). Simply put, fiction is what makes us tick and go on.

I will finish with an anecdote: back in 1991, during the first Gulf War, my family and I were living in a town on the Iraqi border. The papers were full of news about the war and the pieces of war machinery that the US and Iraq possessed. While I read about various planes, tanks, bombs, and guns as a wide-eyed 11-year-old kid and wondered how they could be built or how awesome they must be, I was still afraid of the firepower that could get us at the push of a finger, even though we sat in the bunker of a blacked-out city in a country that wasn't even at war.
Maybe you are right. Maybe I don't know what progress or scientific progression means. But I have lived long enough to understand that "Necessity is the mother of invention", not excess.

We are already too fragile and vulnerable. Why make things worse by conceiving Frankenstein's monsters while we are still in bed with consumerism and greed?


Tue Mar 15, 2011 11:24 pm
Joined: Wed Jul 01, 2009 11:46 pm
Posts: 1930
I'm sorry, but geothermal and tidal energy don't produce nearly as much energy as nuclear does; geothermal sites aren't exactly growing on trees up the yin-yang; and correct me if I'm wrong, but aren't tidal plants both expensive and a relatively new technology?

And what I really don't get is that you're saying scientific progression and research are a bad thing, while saying that instead of using bad-science nuclear we should use geothermal and tidal plants. Uh... who do you think developed those? Some guy in Peru in his back yard?


Tue Mar 15, 2011 11:44 pm
Joined: Wed Jan 07, 2009 10:26 am
Posts: 4074
Location: That quaint little British colony down south
Culthero wrote:
I have read your criticisms. The extinction argument looks legit, as we can't hope to turn back the gears of time and have already caused irreversible damage to the earth and its other inhabitants. But it still seems to me that our science solves things with overly complex inventions at times and wrong applications at other times.
These being? If an invention is too complex, then the control mechanisms would have to be similarly powerful before it could be applied.
Culthero wrote:
Still, you can't prevent a nuclear reactor from going critical for a meltdown or blasting after a tsunami or an earthquake, and you have to evacuate lots of people
The reactor was built to withstand earthquakes of a magnitude of about 8, if I remember correctly. The earthquake in Japan was of magnitude 9, which is about 32 times more powerful, in terms of energy released, than what the plant was designed to withstand. I can basically guarantee you that the replacement reactor will not be built to similar specifications. For a reference point, it is the most powerful earthquake Japan is known to have suffered, and it shifted the Earth's axis by about 10 cm.
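(To put a number on that: each whole step in moment magnitude corresponds to roughly 10^1.5 times the energy released, so

\[ \frac{E_{M=9}}{E_{M=8}} = 10^{1.5\,(9-8)} = 10^{1.5} \approx 31.6 \]

which is where the "about 32 times" figure comes from.)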
Culthero wrote:
and a lot of others may still die because of the chance of wider contamination.
Modern Japan is able to distribute radiation medicines in sufficient amounts to avoid almost all deaths.
Culthero wrote:
These can, of course, in turn be prevented by taking further measures.
As aforementioned, it is basically impossible to have predicted an earthquake almost six times more powerful than any other earthquake known to have occurred in the region, and building any kind of structure to withstand that would have gone extremely far beyond the safety margins already put in place.
Culthero wrote:
But, in application or in reality, governments and corporations don't usually take all the necessary measures recommended by men of science
The 'men of science' quite reasonably didn't expect an earthquake six times more powerful than any other earthquake experienced in all of Japan's history.
Culthero wrote:
the lack of failsafes or reduced safety at application is a big problem.
There would have been an extremely large number of failsafes, but again: six times more powerful than any other earthquake in Japan's history. Quite literally unprecedented amounts of force were applied to the power plant.
Culthero wrote:
And people are left alone with the dragon someone else has created.
A more apt analogy is of a fire upon which a whole community has been able to cook for years, safely held in a fire-pit in a region with some wind, but not much. One day, the wind reaches levels that flatten huge parts of the community whilst also picking up the fire and making the area dangerous to deal with. Using the word 'dragon' has perhaps unintended but nonetheless inaccurate connotations.
Culthero wrote:
Wouldn't it be better if we had used less risky methods such as tidal or geothermal power stations?
Both of these power sources require specific conditions, almost none of which Japan fulfils.
Culthero wrote:
"Modern guidelines would have" been more careful does not really cut it as long as the motive of either the scientist or the applier/seller is money, power, or glory.
If you don't follow the guidelines, you are almost completely barred from any of the three.
Culthero wrote:
That's why we are discussing progress and morals. Good motives or intentions will not always cut it either, but in their absence you are sure to have more manmade disasters and more tragedy.
The best of motives, unguided by guidelines, will routinely lead you into worse disasters. You are also still assuming, for reasons I cannot fathom, that scientific researchers lack good motives or intentions. I am unsure of the requirements where you live, but in Australia becoming a scientific researcher takes much, much more work, effort and dealing with people who will treat you poorly than is otherwise necessary, especially when there are several places along the way where you could easily stop short and instead take up something that provides significantly more money and easier conditions.
Culthero wrote:
How is creating an A.I. that can possibly outsmart you safe for humanity?
To improve on what we can fathom. For the same reasons that we educate our young and train our workers.
Culthero wrote:
I agree with the fact that an A.I. which surpassed our own mental capabilities would be a literally monumental discovery in the sense that we would be leaving our child behind in the case of total annihilation.
It takes months of research to analyse chemicals that might be useful in medicine. A good amount of the time, that money and time will have been completely wasted. If an AI could discern which chemicals are likely to yield results, then you would get a dramatic increase in the number and variety of medicines from which doctors may choose treatment methods.
Culthero wrote:
However, I wouldn’t be so sure about it nursing us in the senile years of our species, waiting for the time of our final procession rather than replacing us immediately after gaining the power to succeed us.
You seem to think that AI has intent out of the box. An AI can only know what is given to it, and linking it up to basically anything without rigorous testing beforehand would not only be stupid, but against a huge number of scientific research guidelines which you would be following by necessity in order to even think about making it do anything of note. Additionally, you seem to be ascribing qualities of individual organisms to an entire species, which, whilst it may be poetic, lacks much merit when it comes to actual description.
Culthero wrote:
And tell me if there would be any scientific progress at all had it not been for the fiction writer throwing a stone into the bottomless well for the scientist to take out?
I have an enormous respect for science-fiction writers. I have a significant amount of respect for those that are able to properly interpret and use the imaginings of science-fiction writers. One of the ways to improperly interpret and use their imaginings is to completely skip the thought experiments and greater underlying discussions and head straight for the literal occurrences in the work. In my opinion, it both produces poor decisions and neglects the work of the author.
Culthero wrote:
Scientists invent/discover. We apply it.
Other scientists apply it, actually. Basically all significant research will require a large amount of expertise to actually apply. To apply research, you are going to have to understand it at some significant level.
Culthero wrote:
It makes an impact on our lives and the writer's life just the same. Then he/she writes about "what would be if..." and after that the cycle begins anew with the scientist making a fresh discovery after reading "what would be if..." (The "discovery-application-impact" cycle is from Future Shock by Toffler). Simply put, fiction is what makes us tick and go on.
This implies that fiction is a prevalent or major source of inspiration for scientific research. I highly doubt the former, and am fairly doubtful of the latter.
Culthero wrote:
I will finish with an anecdote: back in 1991, during the first Gulf War, my family and I were living in a town on the Iraqi border. The papers were full of news about the war and the pieces of war machinery that the US and Iraq possessed. While I read about various planes, tanks, bombs, and guns as a wide-eyed 11-year-old kid and wondered how they could be built or how awesome they must be, I was still afraid of the firepower that could get us at the push of a finger, even though we sat in the bunker of a blacked-out city in a country that wasn't even at war.
The Tsar Bomba, if dropped on Sydney, could perhaps spread injurious levels of radiation to Melbourne, where I live. The RPG-7 weighs less than ten kilograms and can deliver rounds that penetrate 750 mm of specialised armour or temporarily make the temperature of an enclosed area hot enough to vaporise flesh. The PAW-20 is a man-portable grenade launcher that fires grenades that detonate near the aimed-at enemy, even if said enemy has ducked behind cover. The GAU-8 Avenger fires massive rounds at such a rate that it would be possible to measure its power in terms of tanks killed per second, the answer being above one. The Advanced Tactical Laser, or ATL, is a 100 kW laser being developed for the AC-130 gunship. Being a laser, it would be almost completely accurate to line of sight. A few dozen kW is enough to kill or seriously injure a human. There are weapons that make it theoretically impossible to escape. As long as there is conflict, people will kill each other. It is not lack of science that will stop this - we already have almost all we could want in terms of killing fellow humans - it is science applied to taking away the reasons for conflict. With cleaner and more powerful energy generation you can prevent resource wars. With better agricultural science, you can prevent food wars. With a better understanding of psychology you can at least make opposing sides in ideology wars begin to understand each other, which leads to the cessation of ideology wars.
Culthero wrote:
Maybe you are right. Maybe I don't know what progress or scientific progression means. But I have lived long enough to understand that "Necessity is the mother of invention", not excess.
That is generally taken to mean that in places where there is necessity, people will invent. Your interpretation, however, is quite interesting and raises important questions. I say this, then: if necessity is the mother of invention, then resources must be its father. Without resources to be directed to invention, invention withers and dies. The world is still full of necessity: hunger, depleting resources and disease. Science is what will solve these things, not anything else.
Culthero wrote:
We are already too fragile and vulnerable.
Humans are definitely frail; the average life expectancy of men in days past was half or less of what it is now. Science is not what makes us fragile and vulnerable; it may work both ways, but it mainly works towards empowering us.
Culthero wrote:
Why make things worse by conceiving Frankenstein's monsters while we are still in bed with consumerism and greed?
It is interesting that you chose Frankenstein's monster. It is men that labelled Frankenstein's creation a monster - the creature himself was full of good intention and would have helped people immensely had he been properly fostered and not judged on appearance and with fear. So many things people call Frankenstein's monsters fit the same description. Given the right fostering, genetically modified food has the capacity to save millions of hungry people around the world, and has been shown to be able to do so. If the concept is left to be developed only by those that only wish to harm, then of course the results will be terrible.


Wed Mar 16, 2011 9:12 am
Joined: Mon Oct 25, 2010 5:51 am
Posts: 1198
Location: Sydney
Go, Allstone! Are we keeping score? Wow, that is one long post...


Wed Mar 16, 2011 11:13 am