National Bank
The United States pays roughly 227 billion dollars in interest annually. If 90% of this cost were alleviated, because a National Banking system would be a non-profit, low-interest system designed to benefit all parties involved, roughly 204 billion dollars could be saved annually.
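The arithmetic behind this estimate is simple enough to check directly (the 227 billion figure is the post's own, not independently verified):

```python
# Back-of-the-envelope check of the savings claim above.
annual_interest = 227e9  # dollars paid in interest per year (figure from the post)
reduction = 0.90         # share of the cost hypothetically eliminated

savings = annual_interest * reduction
print(f"Estimated annual savings: ${savings / 1e9:.1f} billion")
```

This reproduces the roughly 204 billion dollar figure quoted above.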
The United States already guarantees and backs most banks in the country, agreeing to provide them with funds to repay investors should many of them withdraw at once. Because the U.S. backs the bankers, the U.S. is, in theory, a bank to the bankers.
If a non-profit, low-interest system were developed (charging only enough interest to run the program, pay employees, and so on) and funded by the United States, an enormous amount of debt and interest could theoretically be lifted from the American people. A 30-year loan at a 3.5% interest rate (a typical market rate) equates to roughly 60% extra money being paid on a house, and at rates around 5.5%, common historically, interest roughly doubles the cost of the home. If most of that interest were eliminated, leaving only the small portion necessary for storing and loaning money and various processing fees, the lifetime cost of housing could be cut dramatically, in some cases nearly in half.
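As a sanity check on these interest figures, the standard fixed-rate amortization formula gives total interest as a share of the principal. The rates below are illustrative assumptions, not actual market quotes; note the formula gives about 62% extra at 3.5%, and closer to a doubling only at rates around 5.5%:

```python
# Total interest on a fixed-rate mortgage, as a fraction of the principal.
# Rates here are illustrative assumptions, not actual market quotes.

def total_interest_fraction(annual_rate: float, years: int = 30) -> float:
    """Interest paid over the life of the loan, per dollar of principal."""
    n = years * 12                       # number of monthly payments
    r = annual_rate / 12                 # monthly interest rate
    if r == 0:
        return 0.0
    payment = r / (1 - (1 + r) ** -n)    # monthly payment per dollar borrowed
    return payment * n - 1               # total paid minus the principal

for rate in (0.035, 0.055):
    extra = total_interest_fraction(rate)
    print(f"{rate:.1%} over 30 years: ~{extra:.0%} extra paid in interest")
```

At 3.5% over 30 years this comes to roughly 62% of the principal in interest; at 5.5% it is roughly 104%, i.e. about double the sticker price.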
Not only would this help the United States' budget, it would also drastically reduce the price of housing, car loans, and most other financed costs.
Most major banks receive the majority of their income from corporations and other businesses, as well as from individuals well above the top 1% of annual incomes, which would allow major interest and debt relief for roughly 98% of the population without affecting the banks' primary income. Not only could this benefit the American people and the United States' budget, it could also allow for more seamless transactions between banks and the U.S. reserves.
If the U.S. created a banking system, it would be building on what it already pays for: the U.S. already protects most banks in the country, installing and funding security systems and providing police support for robberies. As a result, switching over to a U.S.-created system wouldn't put much additional stress on the government to build a banking infrastructure, since it already pays for the majority of it while the banks themselves produce little. Because no physical commodity is produced in banking transactions, banks do not lose money on already-built capital the way a manufacturer does (such as failing to sell a million automobiles next year); their profits scale purely with the number of customers available. Cutting down their business therefore would not destroy money already spent, meaning gains and losses would be roughly proportional. Banks would not be obscenely hurt, and their workers could easily be switched over to officially work for the government, cutting out the middlemen and saving billions.
Friday, August 31, 2012
Random Idea
Take away taxes on consumers, increase them on business, better for average person, no net loss for business or government.
Basic equations!
40,000(dollars) x 25(workers) = 1,000,000 (in market)
Gross product: 1,000,000 (all of the market). Workers spend their money back into the market, to get things, to live. Workers break even; company makes 1,000,000 in sales.
10% tax rate: workers keep 36,000 (dollars) x 25 (workers) = 900,000 to spend, and the government collects 100,000 in income tax; tax the 900,000 spent at 10% again for 90,000 more, giving 190,000 in taxes total, 36,000 dollars a year per worker, and 810,000 left for the market after taxes.
Increase taxes to 19% on business, decrease to 0% on consumers: 40,000 x 25 = 1,000,000 x .19 = 190,000, the same taxes as before.
Except! More products to consumers. Maybe. Idk. O_O
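The toy example above can be checked in a few lines (all figures are the post's: 25 workers, $40,000 wages, everything spent back into the market):

```python
# Checking the "same revenue either way" arithmetic from the example above.
# Integer arithmetic keeps the dollar amounts exact.

workers, wage = 25, 40_000
gross = workers * wage              # 1,000,000 circulating in the market

# Scenario A: 10% income tax plus 10% sales tax on what workers spend.
income_tax = gross * 10 // 100      # 100,000 collected on wages
spent = gross - income_tax          # 900,000 spent back into the market
sales_tax = spent * 10 // 100       # 90,000 collected on spending
tax_a = income_tax + sales_tax      # 190,000 total

# Scenario B: 0% consumer taxes, 19% tax on the business's gross.
tax_b = gross * 19 // 100           # 190,000 total

print(tax_a, tax_b)
```

Both scenarios yield 190,000 in revenue, which is the whole point of the idea.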
Global Warming
So far, there is little concrete proof that global warming is being caused by a predominantly man-made, carbon dioxide driven greenhouse effect as presented by the IPCC and a few other organizations (although in the case of NASA and the EPA, not all members of the organization necessarily agree).[1][2][3] While carbon dioxide may be having some effect, there is little evidence to indicate that it is a significant issue, or predominantly responsible for whatever warming may theoretically be occurring.
Global warming works in many ways, but the carbon dioxide driven greenhouse effect hinges on a small band or layer of carbon dioxide in the mid troposphere, where the effect of carbon dioxide is relatively stronger per unit than that of water vapor.[1][2][3] The ozone layer blocks much of the UV that reaches earth; visible radiation passing through the earth's atmosphere, along with a small amount of UV, is absorbed and re-emitted upward at much lower frequencies, in the infrared, to which various greenhouse gases, including water vapor, are opaque. Greenhouse gases absorb and then re-radiate this infrared radiation, produced by the earth from visible light and other forms of radiation, in nearly all directions, some of it back down towards earth, which prolongs its time in the atmosphere and warms the earth.[1][2][3] Water vapor sits at roughly 20,000 parts per million in the atmosphere, compared to a maximum of about 400 for carbon dioxide, i.e. over 50 times the concentration. This makes the effect of carbon dioxide relatively minor near the surface compared to the effect of water vapor and other more powerful and more prevalent greenhouse gases.[1][2][3][4][5]
However, the reality is somewhat more complex for the carbon dioxide driven greenhouse effect. The atmosphere near the surface is largely opaque to thermal (infrared) radiation, with exceptions for "window" bands that let some of the heat through, and most heat loss from the surface is by sensible and latent heat transport, i.e. more or less direct heat transfer.[1][2][3] Radiative energy losses become increasingly important higher in the atmosphere, largely because of the decreasing concentration of water vapor, an important greenhouse gas. It is more realistic to think of the carbon dioxide greenhouse effect as applying to a "surface" in the mid-troposphere, which is effectively coupled to the actual surface by the lapse rate. This particular layer of carbon dioxide matters far more to the warming of the earth than carbon dioxide otherwise would as a greenhouse gas, due to its increased relative concentration at that altitude and the lack of water vapor below intercepting most of the infrared.[1][2][3][4][5][6]
Carbon dioxide produced at the surface, such as by cars and animals, rarely reaches this layer, since it is denser than air and tends to clump near the surface rather than diffusing uniformly upward.[1][2][3][4] Unless carbon dioxide reaches the mid troposphere, its effect on this form of the greenhouse effect, and given its small amount in comparison, its overall effect, is relatively negligible. Moreover, because of the manner in which it reflects radiation back down, this mid-tropospheric "surface" is already much thicker than is required to warm the earth; so little radiation gets past it that increasing its thickness would also be mostly negligible in warming the surface. It is for this reason that rising carbon dioxide levels from cars, fires, and other man-made sources are, by this argument, relatively insignificant.
In other words, the worry is not the total volume of carbon dioxide but its distribution. Since it already coats practically all of the mid troposphere, albeit unevenly, and with important exceptions for window bands, it reflects nearly all of the radiation that can be reflected (predominantly in the infrared spectrum) back down. This suggests that increasing levels of carbon dioxide will have a negligible impact: as long as there is a near-complete cover of the earth's mid-tropospheric atmosphere, even an uneven one, forming a virtual wall, it will reflect most of the infrared back down, and increasing levels do not change its effects. This is suggested both by how it operates and by satellite data showing no increase of temperatures over areas with higher carbon dioxide levels in their mid-tropospheric regions. In other words, while some areas have higher or lower carbon dioxide levels, those levels do not seem to be affecting local temperatures at all.[1][2][3][4]
Indeed, it was originally assumed carbon dioxide had a near-even spread, partly because of this near-even infrared reflection. Areas with higher carbon dioxide levels do not necessarily end up hotter: areas over the equator tend to have less carbon dioxide than temperate zones, yet their temperatures are often higher.[1] While important to the global warming cycle, carbon dioxide's relative power cannot be measured on a unit-to-unit basis; indeed, the carbon dioxide in the ocean and near the surface is considered less important than that higher up in the atmosphere, in the mid troposphere. The importance of carbon dioxide in the greenhouse cycle depends not on its amount but on its distribution in the atmosphere, which also means that adding more to the already-covered regions of distribution will likely have a negligible effect on warming. Any effect of increasing levels, if they do increase, would thus be limited to the surface, where carbon dioxide is one of the weakest greenhouse gases compared to natural gas (methane), nitrous oxide, and even water vapor.
The amount we do produce is also being rapidly absorbed by trees, the ocean, and other waterborne carbon dioxide consuming creatures such as algae. Researchers at NASA, using the carbon dioxide output projected by the IPCC, argued that warming would have to be considerably lower over the same time frame, given how much carbon dioxide is likely to be absorbed by these sinks, even assuming it made it to the mid troposphere and amplified its effects, which is unlikely. In a paper in Geophysical Research Letters, NASA scientists estimate that doubling atmospheric carbon dioxide would result in at most 1.64 degrees Celsius of warming over the next 200 years. As stated by NASA, the IPCC "did not allow the vegetation to increase its leaf density as a response to the physiological effects of increased CO2 and consequent changes in climate." Other assessments included these interactions but did not account for vegetation down-regulation reducing plants' photosynthetic activity, and so produced a weak negative vegetation response. According to NASA: "When we combine these interactions in climate simulations with 2 × CO2, the associated increase in precipitation contributes primarily to increase evapotranspiration rather than surface runoff, consistent with observations, and results in an additional cooling effect not fully accounted for in previous simulations with elevated CO2."[1][2][3][4][5][6]
There are also a lot more radiative energy losses in this carbon dioxide zone than the IPCC has suggested; according to some individuals at NASA, the resulting warming is significantly less. While the carbon dioxide reflects virtually all the infrared that reaches it back down to earth, only about 50% of the radiation produced by the earth is infrared, and a certain percentage of the energy is lost as latent and sensible heat, reducing its effect, in addition to heat absorbed by the atmosphere and then lost by its expulsion. Even assuming an increase would have a significant effect, it would as a result be much, much less. And while NASA proclaims the importance of carbon dioxide in the warming cycle, it does not state that increasing it will have a significant effect.[1][Remote Sensing PDF]
Carbon dioxide levels taken from ice core drilling are routinely used to estimate the temperatures of previous ages. There is a connection between warm weather and carbon dioxide, but it is not the carbon dioxide causing the warming. When the oceans warm, the amount of carbon dioxide dissolved in them decreases, because warmer water cannot hold as much carbon dioxide, much as colder carbonated drinks stay fizzy longer (warm drinks can too, but only under pressure, which is why warm soft drinks often burst in the heat). Carbon dioxide is thus released as the oceans warm, which makes ice core measurements a good relative gauge of past ocean-influenced carbon dioxide levels, since over 70% of the carbon dioxide released is from the ocean.[1][2][3][4][5][6] It should be noted that since carbon dioxide rises when it is warm, and not the other way around, carbon dioxide released into the atmosphere cannot be exponentially warming the earth, or no cooling events would ever happen; if anything, the earth has cooled since past warming events despite the carbon dioxide the oceans released, compounding the issue with holding carbon dioxide predominantly responsible for the warming. And if carbon dioxide did drive the change, and the majority of it comes from the oceans based on their temperatures, where did the rest of the carbon dioxide come from?
The idea of the oceans rising due to thermal expansion and the melting of the ice caps is also silly. Most ice in the oceans is stored under water,[1][2][3] and water expands when frozen, suggesting that ice melting under water should, if anything, slightly decrease ocean levels. As well, water reaches its maximum density at 3.98 °C (39.16 °F) and expands below 0 °C, while frozen. While surface temperatures are often some 60 °F, the water beneath the surface makes up the most significant portion of the ocean and averages roughly 0 °C (32 °F) to 3 °C (37 °F). This means that unless there is a sudden, extremely sharp increase in temperature, slightly increased heat should actually cause ocean levels to decrease slightly rather than rise, if any significant change occurs at all, simply due to the vastness of the ocean and the relative irrelevance of atmospheric temperatures to it.[1][2][3]
We also do not have as many carbon dioxide producing fuels as the IPCC states, even though consumption, and therefore production of carbon dioxide, is expected to increase over the next 200 years.[1]
The world has, in proven reserves, roughly 1,324 billion barrels of oil, 300 trillion cubic meters of natural gas, and 860 billion tonnes of coal.[1][2][3] Worldwide consumption runs at roughly 31.4 billion barrels of oil per year, 3.2 trillion cubic meters of natural gas, and 7.25 billion tonnes of coal. At the current rate of consumption, this would mean running out of oil in about 42 years, natural gas in 93.75, and coal in roughly 118. Oil represents some 40% of total fossil fuel consumption, and in 2008 the energy supply was 33.5% oil, 26.8% coal, and 20.8% gas, out of total energy consumption.[1]
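Dividing the quoted reserves by annual consumption gives the years-remaining estimates above (these are the post's figures; real reserve estimates vary by source and year):

```python
# Reserves-to-consumption ratios using the figures quoted above.
# These are the post's numbers; real reserve estimates vary by source and year.

reserves = {              # proven reserves
    "oil":  1_324e9,      # barrels
    "gas":  300e12,       # cubic meters
    "coal": 860e9,        # tonnes
}
consumption = {           # rough annual worldwide consumption
    "oil":  31.4e9,       # barrels per year
    "gas":  3.2e12,       # cubic meters per year
    "coal": 7.25e9,       # tonnes per year
}

for fuel in reserves:
    years_left = reserves[fuel] / consumption[fuel]
    print(f"{fuel}: ~{years_left:.0f} years at current consumption")
```

This reproduces the roughly 42-year oil, 94-year gas, and 118-to-119-year coal figures.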
The idea that we're going to increase temperatures substantially despite running out of these primary carbon dioxide producing materials is mostly unfounded. If consumption increased through the discovery of new fossil fuel sources (which is possible, such as natural gas at the bottom of the ocean and other unfound or untapped reserves), perhaps this timeline could be extended, but considering that global usage is expected to fall with rationing of resources and improvements in fuel efficiency, it is even less likely.
It should also be noted that warming is only occurring in key, isolated places, such as parts of Africa, Australia, and Alaska. Specific areas of the earth warming does not mean the entire earth will warm; the effects of, say, a 32 degree area suddenly turning to 34 are unlikely to spread, since other areas will likely remain unaffected. The total global average is increasing, but the entire earth is not warming equally.
Some Satellite Data
So, increasing surface carbon dioxide levels should have a negligible effect in warming the surface, and definitely nothing as high as a degree or so even in the next 100 years. However, if warming were the result of a carbon dioxide driven greenhouse effect, we'd see a rise in mid-tropospheric temperature proportional to that on the surface. Do we?
Well, no. Satellites and weather balloons have documented little if any change[1][2][3][4]; even the IPCC's satellite documented little change[1][2]. The IPCC's official stance is that there is "net spurious cooling." Looking at the satellite data, however, one possible explanation for the lack of change was proposed: that the orbital decay corrections were off as the satellite got closer to the atmosphere and would eventually fall back through, both from the earth's gravity increasing as it neared and from atmospheric distortions during stronger solar periods increasing UV and its effects on the atmosphere. The problem with these calculations is that orbital decay was already accounted for. Assuming it were recalculated, the concern was that as the satellite got closer to earth, its view would be off due to the curvature of the earth. But Microwave Sounding Unit data doesn't necessarily change with the angle of incidence to the earth, since there is a varied gigahertz and aperture range. There might be less coverage of the earth, but that would simply mean less data, not a more negative trend (short of a series of coincidences); for all intents and purposes, if it did anything, it would suggest the mid troposphere was warming more than it should have. As the angle of incidence increases, the microwave sounding signal takes longer to travel between the satellite and the ground, since the slant path, the virtual hypotenuse, lengthens. The longer travel time could be perceived as a longer wavelength, or a decrease in air pressure, which could in turn be read as warming, expanding air (though this impact would likely be negligible).
In any case, this explanation would require the satellite's orbital decay to have almost exactly matched the surface temperature change, proportionally, which no other satellite or weather balloon has recorded and which is increasingly improbable. Even if every satellite and weather balloon were somehow slightly off in just the right direction, each measuring virtually the same temperature for separate, random reasons, the evidence behind that conclusion would be scientifically and mathematically unfounded. It would leave an unexplained cause affecting every satellite and weather balloon equally, suggesting a gap in our fundamental understanding of the underlying sciences far beyond the scope of global warming, in which case global warming would be the least of our worries.
To directly compare MSU2R with radiosondes, a surface temperature layer is added to the radiosonde layers, and a vertical integration over all layers is done to compute an effective MSU2R trend of -0.02 K per decade instead of -0.05 K, in closer agreement with the observed +0.07 K per decade surface trend. Even so, it still displays a negative trend. Decreasing or not, the essential point is that the mid-tropospheric data is neither complete nor indicative of what global warming would suggest, even after compensating for heat in a way that gives a much higher increase than would be expected. Basically, the data does not suggest an increase in warming as a result of the carbon dioxide in the mid troposphere, and potentially even records the opposite effect.[1][2][3]
While orbital decay could theoretically be compensated for, doing so does not negate the satellite data: at best it still shows cooling of -0.02 K per decade, according to that data.
Even if the changes are "spurious", they could still exist, so the data should not be construed to reflect a predicted model, in any case.
The fallibility of Temperature Measurements
What of the correlation between carbon dioxide and temperature? Heat is going up, carbon dioxide is going up, therefore there must be a connection? While there might be, it seems rather inconsistent. If this were atmospheric warming driven directly by increased carbon dioxide, the increase should be even, but the figures are relatively erratic.[1][2][3] While many "positive" links have been asserted, they have not in fact proven a direct correlation with carbon dioxide, which, if carbon dioxide were the driver, there theoretically should be: carbon dioxide increases, the earth heats up X amount, supposedly. What the data shows instead, more or less, is no direct connection between heat and carbon dioxide: wild, variable temperature changes, even over long periods of time, while carbon dioxide has increased steadily, without a correspondingly steady increase in temperature, even on average. If climatologists truly understood the relationship, then for another earth-like planet, say some 10 degrees cooler, they could say how much a given carbon dioxide increase would raise that planet's temperature. The fact of the matter is, all that has been measured is an increase in temperature and a theoretical increase in carbon dioxide, with the assumption that if the two correlate, X amount of temperature increase can be expected. But the temperature may have risen regardless of carbon dioxide, due to other factors or other greenhouse gases, or may simply have been arbitrary, a slow warming coming out of the ice age.
It should also be noted that most of the warming within the 100-year record has occurred in the last 20 years, and it has been surface temperature increases, not necessarily atmospheric ones. While the data presents various ups and downs that easily average out, it is only in these last 20 years that we see massive increases in temperatures. It may simply be that the last 20 years have been unusually warm, with no real direct connection to human activities. Using the earth's average temperature when it is the hottest it has been in the last 1,300 years as a baseline for global warming may indeed produce a biased result.
Additionally, we are just coming out of a "little ice age."[1][2][3] Temperatures from roughly 1300-1850 A.D. were around 1 °C cooler. If the earth is warming, this would be consistent with re-normalizing toward regular trends, and wouldn't denote any significant increase beyond that. Many theories exist as to why the little ice age occurred, and many more suggest it was potentially localized to specific areas, but perhaps the explanation that ought to be considered most is the arbitrary variability and fluctuation of weather. Even according to the IPCC, the 1 degree difference was rather "modest" and probably ineffectual, suggesting the recent warming may be just the same.
Weather balloons didn't begin to monitor weather until about 1896, starting with a single French meteorologist,[1] and didn't become accurate, stable, or consistent until the 1950s. The first satellite, the Soviet Union's Sputnik, reached space in 1957, and satellite weather monitoring did not begin until some time after that. The International Meteorological Organization (IMO) was founded in 1873, but its successor, the World Meteorological Organization, was not established until 1950. Antarctica, Alaska, and other important places did not begin substantial weather monitoring until later still. The notion that surface temperature measurements from the 1850s are "good enough", when such measurements would be disregarded completely if taken today, lacking accurate atmospheric measures, different altitude readings, and a multitude of other factors, is silly.
While ice core drilling has shown correlations between carbon dioxide and temperature, this is most likely a lag in production. When the oceans, or pretty much any water, warm, they release carbon dioxide; when they cool, they absorb more. A little of this carbon dioxide is often trapped in ice, revealing relative carbon dioxide levels for a given timeframe. This means that most of the carbon dioxide found during warm periods is likely a result of warmer oceans, and not the other way around.[1][2] Compounding the issue, if carbon dioxide were the primary cause of rising global temperatures, then as temperatures rose and released yet more carbon dioxide, the earth would never have cooled back down, which it has, considerably, since those times. It should also be noted that high carbon dioxide levels could theoretically coincide with times of great cold, in line with increased volcanic activity, which can cool the earth.
This also means that "the most carbon dioxide in 650,000 years"[1][2][3] could likely be explained by the fact that the current ice age began in the Pleistocene Epoch some 2.6 million years ago, and that it has consistently warmed, with carbon dioxide levels rising, over the last 10,000 years.
Even if warming is occurring, it is not happening evenly across the whole world, so it is evidently not a uniform effect of carbon dioxide. Carbon dioxide levels are lower in the tropics and at the equator than in most temperate zones, yet those areas are substantially warmer; the amount of carbon dioxide likely has little if any bearing on these temperatures. This follows from the band argument: beyond a certain amount, the quantity of carbon dioxide does not matter, only its distribution. If it coats the mid troposphere entirely, albeit unevenly, the same general greenhouse effect occurs everywhere, reflecting radiation back down towards the earth, while water vapor and other greenhouse gases determine how much is absorbed.
But what of Venus?
Venus's atmosphere is some 96.5% carbon dioxide, while earth's is at most some 0.038% (380 ppm). That alone is roughly a 2,540-fold difference; considering that Venus's atmosphere is some 91 times denser, its carbon dioxide levels come to roughly 231,000 times earth's. Even assuming Venus is 800 degrees warmer (it is less than this), a linear scaling would mean only about a 0.0035 degree increase for doubling the earth's carbon dioxide.[1]
It should be noted, however, that Venus's carbon dioxide likely came after the warming, when its oceans evaporated and the water dissociated into hydrogen and oxygen under solar radiation; and very little light reaches Venus's surface, since it is reflected by the incredibly thick atmosphere, so the impact of carbon dioxide there is minimal.
Not that Venus is necessarily a good analogue for earth.
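The crude per-unit comparison above can be written out explicitly. All inputs are the post's assumed figures, and the linear scaling is deliberately naive, as the text itself concedes:

```python
# The crude per-unit Venus comparison, written out. All inputs are the
# post's assumed figures; the linear scaling is deliberately naive.

earth_co2_fraction = 0.00038    # ~380 ppm of earth's atmosphere
venus_co2_fraction = 0.965      # ~96.5% of Venus's atmosphere
density_ratio = 91              # Venus's atmosphere is ~91x denser
venus_excess_warmth = 800       # assumed temperature difference, degrees

co2_ratio = (venus_co2_fraction / earth_co2_fraction) * density_ratio
per_earth_unit = venus_excess_warmth / co2_ratio
print(f"CO2 ratio: ~{co2_ratio:,.0f}x")
print(f"Linear warming per earth's worth of CO2: {per_earth_unit:.5f} degrees")
```

This gives a ratio of roughly 231,000 and a per-unit figure of about 0.0035 degrees; note the scaling spreads the 800 degrees linearly over each "earth's worth" of carbon dioxide, rather than per doubling.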
What to take from This
Global warming, as a significant trend worth worrying about, may be false; it is likely the earth is warming due to arbitrary weather patterns that cycle without much impact from human activity, if it is warming much at all.
However, gasoline and other fossil fuels are expensive and increasingly hard to get ahold of. Mercury levels in the ocean, predominantly a result of burning coal, are so high that the FDA recommends limiting how much fish people eat for fear of mercury poisoning and mercury build-up. All the pollution and materials we create go into the atmosphere to be breathed in, rained back down, and absorbed into drinking water and the habitats of the animals we consume, hurting not only our ecosystem but potentially ourselves as well.
We have maybe 40-60 years of cheaply available oil left and about 120 years of coal at the current rate of consumption, and consumption is set to increase, potentially doubling by 2050 and cutting the remaining time roughly in half. If we don't switch our fuel supplies over to cheaper, less polluting, and more available options, such as thorium, or burning fuel in steam turbines and using algae to capture the exhaust for improved efficiency and safety, we may all suffer economically, strategically, and in our health.
Even if global warming is untrue, there is no detriment to improving our current energy situation and pursuing energy independence, whether in the U.S. or one's country of origin, to be self-reliant and not dependent on foreign intervention or resources.
If the globe is warming, whether arbitrarily or as a result of some other mechanism, it is still important to understand this so we can understand the effects.
Another important thing to consider is that the scientific institutions purporting global warming are not necessarily wrong. They may have proposed the ideas, but it is only because of the evidence they provided that it is possible to potentially prove them wrong. The assessments of isolated individuals within these institutions, going off raw numbers, are potentially valid for what those numbers alone would produce; however, when variables such as the current temperatures of the oceans, their vastness, and the actual mechanism of carbon dioxide's warming are considered, rather than equating an increase in heat to a raw unit-to-unit variable, a clearer picture becomes available, and we advance our scientific understanding of the world.
It should also be noted that it is not in fact the case that "98% of scientists agree" with the assessment. Rather, according to individual assessments, there may be a 90% confidence rating (according to the IPCC), and an American Geophysical Union (AGU) survey comprised two questions, essentially "do you think temperatures have risen since the 1850s" and "do you think humans were involved", to which only some 80% responded yes. However, climate change is not necessarily the same as global warming, and a "significant" contribution, as compared to a negligible one, could be less than 1%, considering what a massive impact it would be for humans to have affected a millions-of-years cycle at all. While humans have created roads, buildings, and lights that practically blot out the earth when seen from space, turned land over to agriculture, wiped out, created, and spread multiple species, deforested, and built massive structures, this does not necessarily mean they have raised temperatures. In any case, "I heard that somebody heard that somebody heard" is not good evidence for scientific inquiry. A thousand years ago, many people "knew" the earth was flat; 500 years ago, many people "knew" the universe revolved around the earth; and 10 years ago we "knew" carbon dioxide was uniform throughout the atmosphere. Think of everything we'll know tomorrow.
So far, there is little concrete proof that global warming is being caused by a predominantly man-made, carbon dioxide driven greenhouse effect as presented by the IPCC and a few other organizations (although in the case of NASA and the EPA, not all members of those organizations necessarily agree).[1][2][3] While carbon dioxide may be having some effect, there is little evidence to indicate that it is a significant issue, or predominantly responsible for whatever warming may theoretically be occurring.
Global warming works in many ways, but the carbon dioxide driven greenhouse effect depends on a small band or layer of carbon dioxide in the mid troposphere, which makes the effect of carbon dioxide relatively stronger per unit than water vapor.[1][2][3] The ozone layer blocks much of the UV that reaches earth; visible radiation, and a small amount of UV, passes through the atmosphere, is absorbed by the surface, and is re-emitted at much lower frequencies, in the infrared, to which various greenhouse gases, including water vapor, are opaque. Greenhouse gases absorb and then re-radiate this infrared in nearly all directions, some of it back down towards earth, which prolongs its time in the atmosphere and warms the surface.[1][2][3] Water vapor exists at roughly 20,000 parts per million in the atmosphere, compared to a maximum of about 400 for carbon dioxide, or over 50 times the volume. This makes the effect of carbon dioxide relatively minor near the surface in comparison to water vapor and other more powerful and more abundant greenhouse gases.[1][2][3][4][5]
However, the reality is somewhat more complex for the carbon dioxide driven greenhouse effect. The atmosphere near the surface is largely opaque to thermal or infrared radiation (with exceptions for "window" bands, which let some heat through), and most heat loss from the surface is by sensible and latent heat transport, or more or less direct heat transfer.[1][2][3] Radiative energy losses become increasingly important higher in the atmosphere, largely because of the decreasing concentration of water vapor, an important greenhouse gas. It is more realistic to think of the greenhouse effect, with carbon dioxide, as applying to a "surface" in the mid troposphere, which is effectively coupled to the ground by a lapse rate. This region of carbon dioxide matters far more to the warming of the earth than it otherwise would as a greenhouse gas, due to its relative concentration there and the lack of water vapor reflecting most of the infrared back down.[1][2][3][4][5][6]
Carbon dioxide produced at the surface, such as by cars and animals, rarely reaches this layer, since it is denser than air and clumps near the ground rather than diffusing uniformly upward.[1][2][3][4] Unless carbon dioxide reaches this band in the mid troposphere, its effect on this form of the greenhouse effect is relatively negligible, given its small amount in comparison. Due to the manner in which it reflects radiation back down, this mid-tropospheric "surface" is already much thicker than is required to warm the earth; because so little radiation gets past it, increasing its thickness would also be mostly negligible in warming the surface. It is for this reason that increasing carbon dioxide levels, produced by cars, fires, and other man-made sources, are relatively insignificant.
In other words, the worry is not the total volume of carbon dioxide, but its distribution. Since it already coats practically all of the mid troposphere, albeit unevenly, and with important exceptions for window bands, it reflects nearly all of the radiation that can be reflected (predominantly in the infrared spectrum) back down. This suggests that increasing levels of carbon dioxide will have a negligible impact: as long as there is a near complete cover of the mid troposphere, even an uneven one, forming a virtual wall, it will reflect most of the infrared back down. Increasing levels do not change its effect, as evidenced by how it operates and by satellites, which have shown no increase of temperatures over areas with higher carbon dioxide levels in their mid-tropospheric regions. In other words, while some areas have higher or lower carbon dioxide levels, those levels do not seem to be affecting local temperatures much at all.[1][2][3][4]
Indeed, it was originally assumed that carbon dioxide had a near even spread, partially as a result of this near even infrared reflection. Areas with higher carbon dioxide levels do not necessarily tend to be warmer. Areas over the equator tend to have less carbon dioxide than temperate zones, yet their temperatures are often higher.[1] While important to the global warming cycle, relative power cannot be measured on a unit-to-unit basis; indeed, the carbon dioxide in the ocean and near the surface is considered less important than that which is higher up, in the mid troposphere. The importance of carbon dioxide in the greenhouse cycle depends not on amount, but on distribution in the atmosphere; this also means that adding more to the important regions will likely have a negligible effect on warming. Increasing levels, if they do increase, will likely have negligible effects except at the surface, where carbon dioxide is one of the weakest greenhouse gases in comparison to natural gas, nitrous oxide, and even water vapor.
The amount we do produce is also being rapidly absorbed by trees, the ocean, and other carbon dioxide consuming organisms such as algae. NASA researchers, using the amount of carbon dioxide the IPCC predicts will be produced, showed that the outcome would have to be much cooler over the same time frame, given how much carbon dioxide is likely to be absorbed by these sinks, even assuming it made it to the mid troposphere and amplified its effects, which is unlikely. In a paper in Geophysical Research Letters, NASA scientists estimate that doubling atmospheric carbon dioxide will result in at most 1.64 degrees Celsius of warming over the next 200 years. As stated by NASA, the IPCC did not allow the vegetation to increase its leaf density as a response to the physiological effects of increased CO2 and consequent changes in climate; other assessments included these interactions but did not account for the vegetation down-regulation that reduces plants' photosynthetic activity, and as such produced a weak negative vegetation response. According to NASA: "When we combine these interactions in climate simulations with 2 × CO2, the associated increase in precipitation contributes primarily to increase evapotranspiration rather than surface runoff, consistent with observations, and results in an additional cooling effect not fully accounted for in previous simulations with elevated CO2." [1][2][3][4][5][6]
There are also far more radiative energy losses in this carbon dioxide zone than the IPCC has suggested; according to some individuals at NASA, significantly less energy is retained. While the carbon dioxide reflects virtually all the infrared back down to earth, only about 50% of the radiation produced by the earth is infrared, and a certain percentage of the energy is lost as latent and sensible heat, reducing its effect, in addition to the heat absorbed by the atmosphere and then lost by its expulsion. Even assuming an increase would have a significant effect, it would be much, much less as a result. And while NASA proclaims the importance of carbon dioxide in the warming cycle, it does not state that increasing it will have a significant effect.[1][Remote Sensing PDF]
Carbon dioxide levels taken from ice core drilling are routinely used to estimate temperatures of previous ages. There is a connection between warm weather and carbon dioxide, but it is not the carbon dioxide causing the warming. When the oceans warm, the amount of carbon dioxide dissolved in them decreases, because warmer water cannot hold as much carbon dioxide, much like how colder carbonated drinks stay fizzy longer (warm drinks can too, but they have to be held under pressure, which is why warm soft drinks often burst in the heat). Carbon dioxide is thus released as the oceans warm, which makes ice cores a good relative measure of past, ocean-influenced carbon dioxide levels, since over 70% of the carbon dioxide released is from the ocean.[1][2][3][4][5][6] It should be noted that since carbon dioxide increases when it is warm, and not the other way around, carbon dioxide released into the atmosphere does not exponentially warm the earth, or else no cooling events would ever happen; if anything, the earth has cooled since the carbon dioxide releases that followed past warming events, compounding the problem with holding carbon dioxide predominantly responsible for the warming. And if carbon dioxide did drive the change, and the majority of it comes from the oceans based on their temperatures, then where did the rest of the carbon dioxide come from?
The idea of the oceans rising due to thermal expansion and the melting of the ice caps is also silly. Most sea ice is stored under water,[1][2][3] and water expands when frozen, suggesting that ice melting under water should, if anything, slightly decrease ocean levels. As well, the maximum density of water occurs at 3.98 °C (39.16 °F), while it expands below that and while frozen. While the surface temperature is often some 60 °F, the water beneath the surface makes up the most significant portion of the ocean and averages some 0 °C (32 °F) to 3 °C (37 °F). This means that unless there is a sudden, extremely sharp increase in temperature, ocean levels should actually decrease slightly from slightly increased heat rather than rise, if any significant change occurs at all, simply due to the vastness of the ocean and the relatively minor influence of atmospheric temperatures on it.[1][2][3]
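The contraction-then-expansion point above can be checked against tabulated densities of pure water. A rough sketch (note that seawater, being salty, actually has its density maximum below its freezing point, so this illustrates only the pure-water case the argument uses):

```python
# Standard tabulated densities of pure water, kg/m^3. Warming from 0 C
# toward the ~4 C density maximum makes water denser (it contracts);
# warming past the maximum makes it expand.
density = {0: 999.84, 2: 999.94, 4: 999.97, 6: 999.94, 10: 999.70, 20: 998.21}

def volume_change_percent(t1, t2):
    """Percent volume change when water warms from t1 to t2 degrees C."""
    return (density[t1] / density[t2] - 1) * 100

print(volume_change_percent(0, 4))   # negative: volume shrinks
print(volume_change_percent(4, 20))  # positive: volume grows
```

The magnitudes are tiny per degree, which is part of why the argument hinges on how slowly the deep ocean warms.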
We also do not have as many carbon dioxide producing fuels as the IPCC states, even though consumption, and therefore production of carbon dioxide, is expected to increase over the next 200 years.[1]
The world has, in proven reserves, roughly 1,324 billion barrels of oil, 300 trillion cubic meters of natural gas, and 860 billion tons of coal.[1][2][3] Worldwide consumption is roughly 31.4 billion barrels of oil per year, 3.2 trillion cubic meters of natural gas, and 7.25 billion tons of coal. At the current rate of consumption, this means running out of oil in about 42 years, natural gas in about 94, and coal in roughly 118 years. Oil represents some 40% of total fossil fuel consumption, and in 2008 the shares of total energy supply were oil 33.5%, coal 26.8%, and gas 20.8%.[1]
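The reserve-lifetime figures above are just division of proven reserves by current annual consumption, a static ratio rather than a forecast. A quick sketch:

```python
# Reserves-to-consumption ratios from the figures quoted above.
reserves = {"oil_bbl": 1324e9, "gas_m3": 300e12, "coal_t": 860e9}
consumption = {"oil_bbl": 31.4e9, "gas_m3": 3.2e12, "coal_t": 7.25e9}

years = {fuel: reserves[fuel] / consumption[fuel] for fuel in reserves}
print(years)  # oil ~42 years, gas ~94 years, coal ~119 years
```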
The idea that we are going to increase temperatures substantially despite the limits of these primary carbon dioxide producing materials is mostly unfounded. If new forms of fossil fuels were found (which is possible, such as natural gas at the bottom of the ocean and other untapped or undiscovered reserves), perhaps these figures could be extended, but considering that global usage is expected to fall with the rationing of resources and improvements in fuel efficiency, it is even less likely.
It should also be noted that warming is only occurring in key, isolated places, such as parts of Africa, Australia, and Alaska. Specific areas of the earth warming does not mean the entire earth will warm, and effects across the whole earth from, say, an area going from 32 degrees to 34 are unlikely, since other areas will likely remain unaffected; the total global average is increasing, but the entire earth is not warming equally.
Some Satellite Data
So, increasing surface carbon dioxide levels should have a negligible effect in warming the surface, certainly nothing as high as a degree or so even in the next 100 years. However, if warming were the result of a carbon dioxide driven greenhouse effect, we would see a rise of temperature in the mid troposphere proportional to that on the surface. Do we?
Well, no. Satellites and weather balloons have documented little if any change,[1][2][3][4] and even the IPCC's satellite data documented little change.[1][2] The IPCC's official stance is that there is "net spurious cooling". Looking at the satellite data, one possible explanation for the lack of change was proposed: that the orbital decay calculations for the satellite were off as it got closer to the atmosphere, pulled down by earth's gravity and by atmospheric distortions from increased solar activity raising UV levels and expanding the atmosphere. The problem with this is that orbital decay was already accounted for. The recalculation argument held that, as the satellite got closer to earth, its view would be distorted by the curvature of the earth. However, Microwave Sounding Unit data does not necessarily change with the angle of incidence to the earth, since there is a varied gigahertz and aperture range. There might be less coverage of the earth, but that would simply mean less data, not a more negative trend (barring a series of coincidences), and if anything it would suggest the mid troposphere was warming more than it should have. As the angle of incidence with the earth increases, the microwave sounding signal takes longer to travel from the satellite to the ground and back, since the slant path, the virtual hypotenuse, lengthens. The longer travel time could be perceived as a longer wavelength, or a decrease in air pressure, which could in turn be read as warming, expanding air (though this impact would likely be negligible).
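The hypotenuse point above can be illustrated with a toy flat-geometry calculation. The altitude is only a rough figure for a polar-orbiting satellite, and real retrievals account for earth's curvature, so this is a sketch of the geometry, not of any actual MSU processing:

```python
import math

# As the viewing angle moves off nadir, the line of sight through the
# atmosphere lengthens like a hypotenuse: path = altitude / cos(angle).
def slant_path(altitude_km, off_nadir_deg):
    """Line-of-sight distance to the surface, ignoring earth's curvature."""
    return altitude_km / math.cos(math.radians(off_nadir_deg))

h = 850.0  # rough altitude of a polar-orbiting weather satellite, km
for angle in (0, 15, 30, 45):
    print(angle, round(slant_path(h, angle), 1))
# At 45 degrees the path is ~41% longer than at nadir (1/cos 45 = sqrt 2).
```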
In any case, this would require the orbital decay of the satellite to have almost exactly matched the temperature change on the surface of the earth, proportionally, which no other satellite or weather balloon has recorded, and which would be wildly improbable. Even if every satellite and weather balloon were somehow slightly off in exactly the right direction, for random and various reasons, to measure virtually the same temperature, then all the evidence gathered to reach this conclusion would be scientifically and mathematically unfounded. That would imply a still unexplained cause affecting every satellite and weather balloon equally, and a gap in our fundamental understanding of the underlying science far beyond the scope of global warming; global warming would be the least of our worries.
To directly compare MSU2R with radiosondes, a surface temperature layer is added to the radiosonde layers, and a vertical integration over all layers is done to compute an effective MSU2R trend of -0.02 K per decade instead of -0.05 K, in closer agreement with the observed +0.07 K per decade surface trend. Even so, it still displays a negative trend. Decreasing or not, the essential point is that the mid-tropospheric data is neither complete nor anywhere near where global warming theory would put it, even when compensating for heat in a way that gives a much higher heat increase than would be expected. Basically, the data does not suggest an increase in warming as a result of the carbon dioxide in the mid troposphere, and potentially even records the opposite effect.[1][2][3]
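The vertical-integration step above amounts to a weighted average of per-layer trends. The layer names, weights, and trend values below are invented for illustration; they are not the actual MSU2R weighting function:

```python
# Hypothetical per-layer temperature trends (K/decade) blended with
# weights that sum to one, mimicking the shape of the comparison above:
# a positive surface trend diluted by flat-to-negative air trends.
weights = {"surface": 0.10, "lower_trop": 0.45, "mid_trop": 0.35, "upper_trop": 0.10}
trend = {"surface": 0.07, "lower_trop": 0.02, "mid_trop": -0.05, "upper_trop": -0.10}

effective_trend = sum(weights[k] * trend[k] for k in weights)
print(round(effective_trend, 4))  # a small net value between the extremes
```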
While orbital decay could theoretically be compensated for, that does not negate the satellite data: at best it still shows cooling of -0.02 K per decade.
Even if the changes are "spurious", they could still exist, so the data should not be construed to reflect a predicted model, in any case.
The Fallibility of Temperature Measurements
What of the correlation between carbon dioxide and temperature? Heat is going up, carbon dioxide is going up, therefore there must be a connection? While there might be, it is rather inconsistent. If this were atmospheric warming driven directly by increased carbon dioxide, the relationship should be even, but the figures are relatively erratic.[1][2][3] While many "positive" links have been asserted, they have not proven a direct correlation with carbon dioxide, which there theoretically should be if carbon dioxide is the driver: carbon dioxide increases, and the earth heats up by X amount, supposedly. But what the data shows, more or less, is no direct connection between heat and carbon dioxide. There are wild and variable temperature changes, even over long periods of time, while carbon dioxide has increased steadily, without a correspondingly steady increase in temperature, even as an average. If climatologists know what they are talking about, then on another earth-like planet, say some 10 degrees cooler, how much would X amount of carbon dioxide increase that planet's temperature? All that has actually been measured is an increase in temperature and an increase in carbon dioxide, and only if those two genuinely correlate could X amount of temperature increase be expected. The temperature may have increased regardless of carbon dioxide, due to other factors or other greenhouse gases, or may simply have been arbitrary, or a slow warming coming out of the ice age.
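The correlation point can be made concrete with a toy example: a steadily rising CO2 series against a noisy, up-and-down temperature series still yields a positive correlation coefficient, without that implying a clean or causal link. The numbers below are invented for illustration:

```python
# Pearson correlation from scratch; correlation alone says nothing
# about which series, if either, is driving the other.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

co2 = [310, 315, 320, 325, 330, 335, 340, 345]    # steady rise (ppm)
temp = [0.0, 0.1, -0.1, 0.2, 0.0, 0.3, 0.1, 0.4]  # noisy anomalies (C)
print(round(pearson(co2, temp), 2))  # positive, but far from a clean fit
```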
It should also be noted that most of the warming in the 100-year record has occurred in the last 20 years, and as surface temperature increases, not necessarily atmospheric increases. While the data presents various ups and downs that easily factor out, it is only because of these last 20 years that we see large increases in temperature. It may simply be that the last 20 years have been unusually warm, with no real direct connection to human activities. Using the earth's average temperature at its hottest point in the last 1,300 years as a baseline for global warming may indeed produce a biased result.
Additionally, we are just out of a "little ice age".[1][2][3] Temperatures from roughly 1300-1850 A.D. were around 1 °C cooler. If the earth is warming, this would be consistent with re-normalizing to regular trends, and would not denote any significant increase beyond that. Many theories exist as to why the little ice age occurred, and many suggest it was localized to specific areas, but perhaps the explanation that ought to be considered most is the arbitrary variability and fluctuation of weather. Even according to the IPCC, the 1 degree difference was rather "modest" and probably ineffectual, suggesting the recent warming may be just the same.
Weather balloons did not begin to monitor weather until about 1896, launched by a single French meteorologist,[1] and did not become accurate, stable, or consistent until the 1950's. The first satellite in space, the Soviet Union's Sputnik, reached orbit in 1957, and satellite weather monitoring did not begin until some time after that. The World Meteorological Organization was not formed until 1950, succeeding the International Meteorological Organization (IMO), which was founded in 1873. Antarctica, Alaska, and other important places did not begin substantial weather monitoring until later still. The notion that surface temperature measurements from the 1850's are "good enough", when such measurements would be disregarded completely if taken today, without accurate atmospheric measures, different altitude measurements, and a multitude of other factors, is silly.
While ice core drilling has shown correlations between carbon dioxide and temperature, this is most likely a lag in production. When the oceans, or pretty much any water, warm, they release carbon dioxide; when they cool, they absorb more. A little of this carbon dioxide gets trapped in ice, revealing relative carbon dioxide levels for a given time frame. This means that most of the carbon dioxide found during warm periods is likely a result of warmer oceans, and not the other way around.[1][2] Compounding the issue, if carbon dioxide were the primary cause of increases in global temperature, then each increase would raise carbon dioxide levels even further and the earth would never have cooled back down, which it has done considerably since those times. It should also be noted that high carbon dioxide levels could theoretically coincide with times of great cold, in accordance with increased volcanic activity, which can cool the earth.
This also means that the issue of "the most carbon dioxide in 650,000 years"[1][2][3] could likely be explained by the fact that the current ice age began in the Pleistocene Epoch some 2.6 million years ago, and it has consistently gotten warmer, with carbon dioxide levels rising, over the last 10,000 years.
Even if warming is real, it is not affecting the whole world evenly, so it is clearly not an equal effect from carbon dioxide. Carbon dioxide levels are lower in the tropics and at the equator than in most temperate zones, yet those areas are substantially warmer. It is likely the amount of carbon dioxide has little if any bearing on these temperatures. This is because the impact comes from the band of carbon dioxide existing beyond a certain threshold: the amount of carbon dioxide does not matter, but its distribution, and if it coats the mid troposphere entirely, albeit unevenly, the same general greenhouse effect occurs equally by reflecting radiation back down towards the earth, while water vapor and other greenhouse gases determine how much is absorbed.
But what of Venus?
Venus's atmosphere is some 96.5% carbon dioxide, while earth's is at most some 0.038% (or 380 ppm). That alone is roughly a 2,540-fold difference in concentration; considering that Venus's atmosphere is some 91 times denser, this puts its carbon dioxide levels at roughly 231,000 times earth's. Assuming Venus is 800 degrees warmer (it is less than this), that would imply only about a 0.0035 degree increase for doubling the earth's carbon dioxide.[1]
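The Venus arithmetic, step by step. The final per-doubling figure follows the linear scaling used above, which is a simplification: the standard treatment has warming scale roughly with the logarithm of CO2 concentration, not linearly.

```python
# Ratio of Venus's CO2 abundance to earth's, from the figures above.
venus_co2_fraction = 0.965
earth_co2_fraction = 380e-6   # 380 ppm
pressure_ratio = 91           # Venus surface pressure vs earth's

fraction_ratio = venus_co2_fraction / earth_co2_fraction
total_ratio = fraction_ratio * pressure_ratio
print(round(fraction_ratio))  # ~2540-fold concentration difference
print(round(total_ratio))     # ~231,000-fold including pressure

warming_assumed = 800  # degrees, the generous upper figure used above
print(warming_assumed / total_ratio)  # the linear per-doubling figure
```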
It should be noted, however, that Venus's carbon dioxide likely came after the warming, when its oceans evaporated and dissociated into hydrogen and oxygen under solar radiation, and that very little light reaches Venus's surface, since most of it is reflected, the atmosphere is incredibly thick, and the impact of the carbon dioxide itself is minimal.
Not that Venus is necessarily a good analogue for earth.
What to Take from This
Global warming, as a significant trend to worry about, may be false; it is likely the earth is warming due to arbitrary weather patterns that cycle without much impact from human activity, if it is warming much at all.
However, gasoline and other fossil fuels are expensive and increasingly harder to get hold of. Mercury levels in the ocean, predominantly a result of burning coal, are so high that the FDA recommends limiting the amount of fish people eat for fear of mercury poisoning and mercury build-up. All the pollution and materials we create go into the atmosphere to be breathed in, rained back down, and absorbed into drinking water and the habitats of animals we consume, hurting not only our ecosystem but potentially ourselves as well.
We have maybe 40-60 years' worth of cheaply available oil left, and about 120 years of coal, at the current rate of consumption, and that rate is set to increase, potentially doubling by 2050 and shrinking what little fuel we may have left by then. If we do not switch our fuel supplies over to cheaper, less polluting, and more available options, such as thorium, or burning gasoline in steam turbines and using algae to capture the exhaust for improved efficiency and safety, we may all suffer economically, strategically, and in our health.
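The "doubling by 2050" point can be quantified: if consumption grows at a steady rate that doubles it in roughly 38 years, a reserve lasting N years at today's rate runs out sooner. A sketch, with an illustrative growth rate:

```python
import math

def years_until_exhaustion(static_years, annual_growth):
    """Years a reserve lasts if consumption grows at a fixed annual rate.

    Cumulative use over t years is a geometric series; solving
    ((1+g)^t - 1)/g = static_years gives t = ln(1 + static_years*g)/ln(1+g).
    """
    g = annual_growth
    return math.log(1 + static_years * g) / math.log(1 + g)

growth = 0.0183  # ~1.83%/yr, which doubles consumption in about 38 years
print(round(years_until_exhaustion(42, growth)))   # oil: well under 42
print(round(years_until_exhaustion(118, growth)))  # coal: well under 118
```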
Even if global warming is untrue, there is no detriment in improving our current energy situation and pursuing energy independence, in the U.S. or any country of origin, to be self-reliant rather than dependent on foreign intervention or resources.
If the globe is warming, whether arbitrarily or as a result of some other mechanism, it is still important to understand this so we can understand the effects.
Another important thing to consider is that the scientific institutions purporting global warming are not necessarily wrong. They may have proposed ideas, but it was only because of the evidence they provided that it became possible to potentially prove them wrong. The assessments of isolated individuals within these institutions, going off raw numbers, are potentially valid for what those figures alone would produce; however, when variables such as the current temperatures of the oceans, their vastness, and the mechanism of carbon dioxide's warming are considered, rather than equating an increase in heat to a raw unit-to-unit variable, a clearer picture becomes available, and we advance our scientific understanding of the world.
It should also be noted that it is not in fact the case that "98% of scientists agree" with the assessment; rather, according to individual assessments, there may be a 90% confidence rating (according to the IPCC), and an American Geophysical Union (AGU) survey comprising two questions, essentially "do you think temperatures have risen since the 1850's" and "do you think human involvement played a role", to which only some 80% responded yes. However, climate change is not necessarily the same as global warming. A "significant" contribution, as compared to a negligible one, could be less than 1%, considering what a massive impact it would be for humans to have affected a cycle millions of years old. And while humans have created roads, buildings, and lights that practically blot out the earth when seen from space, turned land over to agriculture, wiped out, created, and expanded multiple species, deforested, and created massive structures, this does not necessarily mean they have increased temperatures. In any case, "I heard that somebody heard that somebody heard" is not good evidence for scientific inquiry. 1,000 years ago, many scientists "knew" the earth was flat; 500 years ago, many people "knew" the universe revolved around the earth; and 10 years ago we "knew" carbon dioxide was uniform throughout the atmosphere. Think of everything we'll know tomorrow.
Saturday, August 25, 2012
Possible Capabilities with Gasoline
Gasoline for Electricity Generation
Gasoline is an incredibly powerful fuel source, possessing around 21.5 million joules per pound, compared to coal at around 11 million joules. It’s currently used in automobiles, planes, and other vehicles due to its raw power and high energy density. However, a lot of gasoline is currently wasted; the efficiency of many automobiles is lacking as much of the gasoline is wasted generating heat. A typical internal combustion engine only possesses about 26% thermal efficiency and only about 20% mechanical efficiency.
Comparatively, steam turbines generally run around 35-40% efficiency and are capable of around 90% fuel utilization, meaning that if gasoline were burned to power specially designed steam turbines, they could easily be around 3-4 times more efficient in terms of energy production, and potentially up to 4.5 times, than standard combustion engines. If this energy were stored as electricity and delivered over the power grid to cars, it could easily be used 3 times more efficiently in vehicles, having been generated with multi-million dollar steam turbines rather than few-thousand-dollar engines. This would mean not only getting three times as much energy from gasoline as we do now, but that the cost of gasoline would essentially drop to a third of its current amount, given our reduced need for it. Generating the energy in a more efficient machine before using it in vehicles could drastically reduce the price of locomotion as well as polluting emissions.
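A quick check of the efficiency arithmetic. The improvement factor depends heavily on which figure is used for the engine side; the low-end engine numbers below are assumptions added only to show where a 3-4x ratio would come from:

```python
# Figures quoted above.
engine_mechanical = 0.20  # typical automobile mechanical efficiency
turbine = 0.375           # midpoint of the 35-40% steam-turbine range

# On these mid-range numbers the ratio is a little under 2x:
print(turbine / engine_mechanical)

# A 3-4x ratio corresponds to lower delivered-to-the-wheels engine
# efficiencies (assumed here for illustration):
for engine_low in (0.10, 0.125):
    print(round(turbine / engine_low, 2))
```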
Algae could be used to capture the exhaust from the steam turbines to prevent it from going into the atmosphere, and that algae could then be used to produce ethanol. As of now, the standard 5-10% ethanol-gasoline blend used in most unleaded gasoline seems to deliver fuel efficiency equivalent to straight gasoline, despite ethanol being around a third weaker than gasoline. This means that burning ethanol alongside gasoline in the right concentration seems to keep the blend's power at the level of straight gasoline, and is basically like adding 10% free fuel to the mixture simply by reusing previously discarded waste products.
It would also make energy self-reliance a much more feasible task. The United States in 2004 imported nearly 65% of its oil, the peak import year for the period from 2000 onward (foreign oil usage is expected to drop to 54% by 2030). If the efficiency of the United States' oil use were increased just 3 times, all the gasoline used in the United States could come from local sources. A dependency on foreign imported oil, some of which could come from questionable sources, would be eliminated, and the United States' energy supply for daily expenses and even economic prowess could be entirely in its own hands.
The price of practically everything could fall (given that the current transport system runs on gasoline), pollution could be virtually eliminated, and various countries' dependence on foreign oil could be removed, allowing them to prosper without being in the hands of countries whose democratic values may not match their own.
While we would still be reliant on gasoline, this process would create far less pollution and provide vastly more energy for no foreseeable increase in fuel cost, making it a good option for the United States' and other affiliated countries' energy needs.
While carbon fiber is expensive, the addition of thorium or otherwise reduced energy costs, at least enough to lower carbon fiber from 15-16 dollars per pound to 5 dollars per pound, would, at slightly less than 1/5th the density of steel (1.5 grams per cubic centimeter compared to 7.85), put carbon fiber at roughly the same unit-to-unit cost as steel, making it more feasible for production while keeping the same level of safety. This could be possible with improved efficiency of electricity production, and if electric cars were lighter, they could not only travel further (eliminating range anxiety) but also lengthen the life of the motor and battery. Potential replacements for lithium ion include potassium ion and lithium titanate, both of which have longer lives than lithium ion and have their own advantages and disadvantages, potassium ion being cheap and lithium titanate being able to recharge faster.
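The unit-to-unit comparison can be checked by asking what steel price per pound gives cost parity per part volume with 5-dollar-per-pound carbon fiber. The result lands near a dollar a pound, in the rough ballpark of finished steel, though any specific steel price here would be an assumption:

```python
# Cost to fill one part volume scales with price-per-pound times density.
steel_density = 7.85  # g/cm^3, from the text
cf_density = 1.5      # g/cm^3, from the text
cf_price = 5.0        # $/lb, the target price named in the text

# Steel price at which an equal-volume steel part costs the same:
parity_steel_price = cf_price * cf_density / steel_density
print(round(parity_steel_price, 2))  # $/lb of steel for parity
```

This ignores that equal strength usually requires less than equal volume of carbon fiber, which would shift parity further in carbon fiber's favor.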
Gasoline is an incredibly powerful fuel source, possessing around 21.5 million joules per pound, compared to coal at around 11 million joules per pound. It is currently used in automobiles, planes, and other vehicles because of its raw power and high energy density. However, a great deal of gasoline is currently wasted: the efficiency of many automobiles is poor, as much of the fuel's energy is lost as heat. A typical internal combustion engine has only about 26% thermal efficiency and only about 20% mechanical efficiency.
Comparatively, steam turbines generally run at around 35-40% efficiency and, in well-designed plants, can approach 90% fuel utilization, meaning that if gasoline were burned to power specially designed steam turbines, energy production could easily be around 3-4 times more efficient, and potentially up to 4.5 times more efficient, than with standard combustion engines. If that energy were stored as electricity and delivered over the power grid to vehicles, we could extract roughly three times as much usable energy from each unit of gasoline, because the energy would be generated in multi-million-dollar steam turbines rather than in engines costing a few thousand dollars each. With our need for gasoline cut to roughly a third, its effective cost would fall correspondingly. Generating the energy in a more efficient machine before using it in vehicles could drastically reduce the price of locomotion as well as polluting emissions.
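As a quick sanity check of the improvement factor claimed above, the arithmetic can be reproduced directly. Note that the 20-26% engine figures and the 90% fuel-utilization figure are the text's own estimates, not measurements:

```python
# Sanity check of the "3-4x, up to 4.5x" efficiency claim, using the text's own figures.
engine_eff_low, engine_eff_high = 0.20, 0.26  # mechanical / thermal efficiency of a car engine
turbine_fuel_eff = 0.90                       # claimed fuel utilization of a steam plant

low = turbine_fuel_eff / engine_eff_high   # improvement vs. the engine's best case
high = turbine_fuel_eff / engine_eff_low   # improvement vs. the engine's worst case
print(f"improvement factor: {low:.1f}x to {high:.1f}x")
```

The claimed range of roughly 3.5x to 4.5x falls straight out of the ratio of the two efficiency figures.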
Algae could be used to capture the exhaust from the steam turbines before it enters the atmosphere, and that algae could then be used to produce ethanol. The standard 5-10% ethanol blend currently used in most unleaded gasoline appears to match the fuel efficiency of pure gasoline, despite ethanol having roughly one-third less energy. In other words, burning ethanol alongside gasoline in the right concentration seems to keep the mixture's power at the level of straight gasoline, effectively adding about 10% free fuel simply by reusing previously discarded waste products.
It would also make energy self-reliance a much more feasible task. In 2004 the United States imported nearly 65% of its oil, the peak import year for the period from 2000 onward (foreign oil usage is expected to drop to 54% by 2030). If the efficiency of the United States' use of oil were roughly tripled, all the gasoline used in the United States could come from domestic sources. Dependency on imported oil, some of which potentially comes from questionable sources, would be eliminated, and the energy supply the United States needs for daily expenses, and even its economic prowess, could be entirely in its own hands.
The price of practically everything could fall (given that the current transport system runs on gasoline), pollution could be sharply reduced, and various countries' dependence on foreign oil could be removed, allowing them to prosper without relying on countries whose democratic values may differ from their own.
While we would still rely on gasoline, the process described above would create far less pollution and provide vastly more energy for no foreseeable increase in fuel cost, making it a good option for the energy needs of the United States and allied countries.
Carbon fiber is expensive, but cheaper energy (from thorium or the efficiency gains above) could lower its price from 15-16 dollars per pound to around 5 dollars per pound. Since carbon fiber is slightly less than 1/5th the density of steel (1.5 grams per cubic centimeter compared to 7.85), that price would make a carbon-fiber part roughly as expensive as an equivalent steel one, making it feasible for production while keeping the same level of safety. This could be possible with improved efficiency of electricity production, and lighter-weight electric cars could not only travel further (reducing range anxiety) but also extend the life of the motor and battery. Potential replacements for lithium-ion batteries include potassium-ion and lithium-titanate cells, both of which have longer lifespans than lithium-ion and carry their own trade-offs: potassium-ion is cheap, while lithium titanate can recharge faster.
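The cost-parity claim can be checked with the densities above. The steel price below is my own rough assumption (about a dollar per pound) for illustration, not a figure from the text:

```python
# Cost of a carbon-fiber part occupying the same volume as one pound of steel.
cf_density, steel_density = 1.5, 7.85  # g/cm^3, from the text
cf_target_price = 5.0                  # $/lb, the text's target carbon-fiber price
steel_price = 1.0                      # $/lb, assumed for illustration

# An equal-volume carbon-fiber part weighs cf_density/steel_density as much as the steel one.
cf_cost = cf_target_price * (cf_density / steel_density)
print(f"carbon fiber: ${cf_cost:.2f} vs. steel: ${steel_price:.2f} per steel-pound of volume")
```

At the $5/lb target, an equal-volume carbon-fiber part comes out just under a dollar per steel-pound of volume, which is where the "about as expensive as steel" claim comes from.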
Ways to improve the Standard U.S. firearm
Current State of Affairs
In August 2010 the Individual Carbine Competition was formed to provide infantry with a newer, more modern weapon; it was cancelled on March 19, 2013, due to budget concerns, the program being expected to cost about 1.8 billion dollars. [1][2] While this decision is shortsighted for many reasons, the general basis will be discussed below.
General Background
The M16 was officially adopted in 1969, although it saw use in the Vietnam War as early as 1963. As of 2013 the firearm has been the primary U.S. service weapon for almost 43 years, and has been in service for at least 50. Comparatively, the M1903 Springfield served for 34 years (1903-1937), the M1 Garand for 23 years (1936-1959), and the M14 for 11 years (1959-1970). As technology progressed, it was generally assumed that firearms would need to be replaced progressively faster, since the rate of technological advancement tends to increase exponentially, requiring more frequent replacements over the years. Despite this, the M16 has technically been the primary U.S. firearm for 43 years, and was widely used in Vietnam before its official adoption in 1969 (the Vietnam War lasted from 1955 to 1975). The weapon has thus been in service for an incredibly long time, twice as long as the M1 Garand and almost five times as long as the M14, even as technology continues to progress rapidly.
At its inception, the weapon was plagued with problems. The U.S. military initially refused to adopt any round weaker than the .30-06 or the 7.62x51mm NATO round, such as the .280 British, prompting NATO to require all allied countries to use a similar, more powerful round. Despite this initial decision, intermediate cartridges have since achieved substantial success and become the norm in NATO countries (the 5.56mm cartridge in particular). The AR-10, also designed by Eugene Stoner, was more accurate, lighter, and had less recoil than the previous M1 Garand or the soon-to-be-adopted M14, while being generally as reliable as the M14. Despite this, it was not chosen by the U.S. military; a more conventional weapon, similar to the M1 Garand, was desired, and so the M14 was ultimately adopted. Rather than utilizing plastics, fiberglass, or other materials available in other weapons, the M14 used a traditional wooden frame and a traditional layout (instead of a bullpup design). It did sport a number of improvements: a 20-round detachable box magazine instead of an 8-round en bloc clip, fed from the bottom instead of the top, allowing easier feeding and greater capacity. The weapon was capable of fully automatic fire, similar to the Browning Automatic Rifle (a light machine gun), but was considerably lighter and shorter, and was deployed to nearly all infantry units.
However, many problems emerged. The M14, while much lighter than a BAR and more similar to the M1 Garand, was considered to have too much recoil to function adequately. Even with its weaker (roughly 3,500 joules compared to about 4,000), lighter, and shorter round (2.75 inches in overall length compared to 3.35), the weapon still produced substantial, and sometimes uncontrollable, fully automatic fire. The concern was so great that M14s sent to Vietnam had their fully automatic fire selectors welded shut, to prevent fully automatic fire. (A Report in X said that the side that fired the most bullets) As a result, the U.S. was ready and willing to replace its standard firearm with a more modern one, citing weight, size, and recoil as significant concerns.
The obvious choices would have been the previous candidates, the EM-2 and the AR-10. The EM-2 was a bullpup weapon without a long stock, shortening the overall length of the weapon by around 10 inches compared to the M14 or M1 Garand. It was only 35 inches long and about 7.8 pounds with a 24.5-inch barrel, and could have been 30.5 inches with a 20-inch barrel (the standard length on the M16 today). It fired a weaker round that could be stored in larger magazines (20 rounds instead of 8) and generally had less recoil, while remaining just as accurate and following a ballistic trajectory similar to the .30-06. The 7mm round was more aerodynamic and had higher ballistic efficiency, giving it almost as much power as the .30-06 at long range. At close range the weapon was generally considered sufficient in stopping power, firing a 9-gram bullet at 770 m/s for 2,680 joules, compared to the AK-47's 8 grams at 715 m/s for about 2,000 joules; it thus had more power than the competing enemy weapon (known for reliability and firepower) along with significantly higher accuracy.
The AR-10, on the other hand, was a much larger, traditionally laid out weapon. Unloaded, it had a similar weight of about 7.5 pounds, but was over 6 inches longer with a 4-inch-shorter barrel (comparatively, the EM-2 could have been over 10 inches shorter with a same-sized barrel). This increased overall length made it more cumbersome, harder to use in close quarters, and harder to aim. It did, however, have substantially more firepower, firing the full-power 7.62mm NATO round with similar accuracy, and had substantially less recoil than the M14 or M1 Garand, meaning it could fire in full auto largely without issue. The choice was essentially between a slightly longer but more powerful weapon and a shorter weapon that was just as accurate and long-ranged but had less (though generally sufficient) power. The AR-10's unique firing system reduced recoil but lengthened the overall weapon, while the EM-2 was shorter but had to use a weaker, smaller round to remain easy to control.
In the end,
On the Aspect of Cost
The Individual Carbine Competition is being shut down largely due to cost, which may again keep U.S. troops from getting a superior firearm. The entire competition was expected to cost 1.8 billion dollars, with the new weapons expected to cost around the same as procuring M16s. Even if the replacement weapon were more expensive, the issue of cost is largely insignificant for several reasons. The U.S. military does not spend much money on firearms relative to its total expenditures, despite the fact that every service member needs a personal defense weapon, which can mean life or death in many situations; a more reliable, and consequently more durable, weapon may also last longer, decreasing overall costs.
The overall cost of firearms is not very high, relatively speaking. At a thousand dollars each, a million U.S. firearms would be about 1 billion dollars; 10 million would be 10 billion, and so forth. An extra barrel and accessories can add about 600 dollars, so roughly 6 billion more on top of that 10 billion for 10 million firearms. There are, presumably, only roughly 8 million M16-type firearms worldwide; nevertheless, assuming the U.S. bought 10 million, this would cost roughly 16 billion dollars. Over the 50-year life of the weapon, that equates to roughly 320 million dollars per year, or about 370 million over the official 43-year lifespan, roughly half a B-2 bomber. Although firearm prices have increased with inflation, at today's prices this would be about 1/2,200th of the military's annual budget per year, or 1/43rd of one annual budget as a single down payment (1/430th of the budget per year over a 10-year time frame).
Comparatively, the SR-25 is around 4,000 dollars per unit. It is extremely accurate, with out-of-the-box accuracy around 0.5-1 MOA, roughly that of a standard M40 sniper rifle. Despite this, it is semi-automatic instead of manually operated, has a 20-round magazine, and produces around the same recoil as an M16 despite firing a substantially more powerful round. It is also around the same weight, has similar ergonomics, and its barrel lasts about as long as the weapon itself, without needing replacement. At around 2.6 times the price, 10 million units would still be only about 40 billion dollars, or 4 billion per year over 10 years, out of a 700 billion dollar budget, easily enough to replace the average soldier's firearm. For just 4 billion a year, every individual, medics, officers, even the average person who might need to pick up a weapon and defend themselves, could be armed; each of the 3 million service members could receive over three sniper-grade semi-automatic rifles. While it might not be the best firearm for our soldiers, even one of the best sniper rifles in our arsenal would be fairly inexpensive in overall cost. A replacement could therefore be roughly 10 times the cost of a standard rifle, at around 15,000 dollars per unit, and probably still be insignificant in terms of overall cost to the military, given what it would provide for every service member.
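The procurement arithmetic in the last two paragraphs can be reproduced directly; all figures, including the 700 billion dollar annual budget, are the text's estimates:

```python
# Reproducing the rifle-procurement estimates from the text.
rifles = 10_000_000
m16_unit, accessories = 1_000, 600   # dollars per rifle / per accessory kit
sr25_unit = 4_000                    # dollars per SR-25
budget = 700e9                       # annual military budget, dollars

m16_total = rifles * (m16_unit + accessories)  # total M16-style fleet cost
sr25_total = rifles * sr25_unit                # total SR-25 fleet cost

print(f"M16 fleet:   ${m16_total/1e9:.0f}B (${m16_total/50/1e6:.0f}M/yr over 50 years)")
print(f"SR-25 fleet: ${sr25_total/1e9:.0f}B ({sr25_total/10/budget:.2%} of the budget per year over 10)")
```

Even the premium SR-25 fleet amortizes to well under one percent of a single annual budget, which is the core of the cost argument here.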
Even assuming cost is a factor, higher-quality and more reliable weapons, such as the FN SCAR, HK416, or XM8, tend to be more
Body Armor and how we could do better
Increasing Body Armor Capabilities
On the Aspect of Cost
The United States military spent roughly 300 million dollars on Interceptor body armor in 2004; an Interceptor system costs roughly $1,585. Theoretically, that equates to roughly 189,274 units, or around 190,000 Interceptor systems; to my knowledge more armor has been purchased since then, but I'm not entirely aware how many have been obtained or how much each vest costs.
60,000 MTV and IMTV (Improved Modular Tactical Vest) vests have been purchased by the Marines, and another 28,364 vests by the Navy. Assuming roughly the same price as the Interceptor, around 140 million extra dollars would have been spent on body armor since August 2008. Both purchases have been predominantly for the Afghanistan and Iraq efforts.
The soft body armor is rated at roughly NIJ Level IIIA, capable of stopping 9mm and .44 Magnum rounds or anything below, but not rifle rounds.
This means that, on average over the past 10 years, the military has spent roughly 45 million dollars annually (to my knowledge) on body armor, most of it presumably for the Iraq and Afghanistan efforts. Even at 70 million dollars a year, that would still be only 1/10,000th of the nearly 700 billion dollar budget the military receives. The armor worn by our primary troops, individual human beings, accounts for less than 1/10,000th of total spending, and that assumes I've underestimated by 35 million dollars a year. It equals roughly 3.5 Apache helicopters, half an F-35, or roughly 1/10th the cost of a single B-2 bomber. The armor worn by medics, tank crews, soldiers on foot, repair crews, practically everyone who can benefit from personal protection, is given only 1/10,000th of the total budget, practically nothing compared to our other purchases.
I actually support how much we spend on these expensive vehicles, and I believe it wholly necessary given their capabilities; I'm not suggesting we give up a B-2 bomber or half an F-35 to procure more body armor, but the relative costs put the issue into perspective. The military could afford better body armor for its primary deployed force (about 200,000 people overseas) for a tiny fraction of its annual spending. The cost of body armor at the moment is nearly negligible, meaning we could theoretically spend gratuitous amounts on it, and yet the current armor has trouble stopping rifle rounds. Cost is not a significant factor in my opinion; even if a more effective option were, say, 10 times more expensive, roughly 15,000 dollars per vest, its overall cost would be nothing compared to its potential impact (such as actually making our soldiers rifle-resistant, opening up a whole new line of capabilities and protection), or even compared to the medical bills we currently pay, as horrific and awful as that is.
Another option, which I'm not exactly recommending but which should also give a sense of scale, is carbon nanotube (CNT) armor from Nanocomp. One existing example is a 2mm piece of material, roughly the thickness of a few business cards, capable of stopping a 9mm round without significant penetration or destruction of the material. The company says it is targeting growth and maturation akin to carbon fiber, with a target material cost of $350 to $400 per kilogram. I'm not sure whether it can stop multiple rounds, what velocity of 9mm rounds it handles, or whether it can withstand repeated impacts, but assuming minimum capabilities this puts the armor at roughly NIJ Level II. Comparable Kevlar vests of that rating are generally around 6mm thick.
This makes the Nanocomp armor roughly 3 times stronger than Kevlar on a thickness-to-thickness basis, assuming minimum capabilities. It likely has roughly the same mass as Kevlar at a given thickness, or slightly less, since carbon nanotube material is around 1.3-1.4 grams per cubic centimeter while Kevlar is around 1.4. The target cost is around 350-400 dollars per kilogram, so each pound would likely be 160-180 dollars. Let's assume 500 dollars per kilogram, or about 230 dollars per pound, since the target is only a predicted mass-production price and could run higher. A standard 16-pound Interceptor vest made of this material would then cost around 3,600 dollars, or about 2.3 times its current price. That would have meant roughly 690 million dollars for 190,000 vests instead of 300 million, a negligible difference considering the billions spent annually, perhaps almost equal to a single B-2 bomber (not that I oppose B-2 bombers; I believe their existence and use is necessary). And this assumes the armor is only 3 times as strong, when it is likely closer to 4; that the cost is 500 dollars per kilogram, when it is likely 350-400 (roughly 480-560 million dollars total); and that there aren't potentially better nanocomp-type materials already available. This would mean tripling the effectiveness of modern body armor, assuming minimum capabilities, for a practically negligible cost.
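Under the deliberately conservative assumptions above ($500/kg rather than the $350-400 target, and a 16-pound vest), the per-vest and fleet costs work out as:

```python
# CNT vest cost under the text's conservative assumptions.
vest_weight_lb = 16
kg_per_lb = 0.4536
cnt_price_per_kg = 500         # dollars; deliberately above the $350-400/kg target
interceptor_price = 1_585      # dollars per current Interceptor system
fleet = 190_000                # vests, from the 2004 purchase estimate

cnt_vest = vest_weight_lb * kg_per_lb * cnt_price_per_kg
print(f"CNT vest: ${cnt_vest:,.0f} ({cnt_vest/interceptor_price:.1f}x an Interceptor)")
print(f"{fleet:,} vests: ${fleet*cnt_vest/1e6:,.0f}M total")
```

The roughly $3,600 per vest and ~$690 million fleet figures in the paragraph above fall directly out of this multiplication.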
Admittedly, it's easy to see why such an armor system might not yet be available: these products are only approaching mass production, and their effectiveness has yet to face the rigor of military testing, so unknown or untested complications may arise.
Something I do believe in supporting immediately is M5 fiber. M5 fiber is about as strong as Kevlar at around half the thickness and weight; it is elastic, much more heat-tolerant, and very chemically stable, meaning its likelihood of breaking down from UV, chemicals, or heat is relatively low. Its heat tolerance is supposedly above that of Nomex, making it potentially better than what current firefighters wear and also potentially useful as general armor in hot environments. Similar to aramids such as Kevlar, it carries roughly the same properties, although its elasticity may make it more durable overall and even more capable of absorbing repeated impacts, given its ability to yield when stretched rather than being permanently deformed and thereby weakened.
Liquid body armor, nano-silica particles suspended in ethylene glycol after being baked in an oven, is known to increase Kevlar's strength to roughly 3.5 times its normal level. The material is a non-Newtonian fluid, meaning it behaves like a solid under stress or impact, like cornstarch in water. Four layers of STF-treated (shear-thickening fluid) Kevlar can dissipate the same energy as 14 layers of neat Kevlar (about 3.5 times as much, although it performs worse against knives and low-velocity impacts). In addition, STF-treated fibers don't stretch as far on impact as ordinary fibers, so bullets don't penetrate as deeply into the armor or transfer as much blunt-force trauma to the wearer; researchers theorize this is because it takes more energy for the bullet to stretch the STF-treated fibers. Potentially, then, STF-treated M5 fiber could be a flat 7 times stronger than Kevlar on a mass basis, and roughly so on a thickness basis; a typical Interceptor vest could be 7-8 times more protective, enough to warrant full-body armor, or lighter armor that retains equal or greater strength. Combined with improved multiple-shot protection from the increased stretch resistance, the armor might provide legitimate multi-round protection against extremely powerful rounds, particularly higher-velocity rounds such as rifle rounds. Both liquid body armor and M5 fiber offer better protection against higher-velocity rounds, potentially allowing greater stopping power against rifle rounds and certain armor-piercing rounds. I am not entirely aware of the costs of either system, but even at 10 times the cost of current body armor the expense would still be negligible.
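The "7 times stronger" figure above is simply the product of the two claimed improvements; both multipliers are the text's estimates, not test data:

```python
# Combining the two claimed multipliers into the "7x" figure.
m5_vs_kevlar = 2.0     # M5: Kevlar-level strength at half the weight/thickness
stf_multiplier = 3.5   # STF treatment: 4 layers match 14 layers of neat Kevlar

combined = m5_vs_kevlar * stf_multiplier
print(f"STF-treated M5 vs. neat Kevlar: {combined:.0f}x")
```

Note that this assumes the two effects multiply cleanly, which would need to be confirmed experimentally.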
At 15,000 dollars per vest or suit, 10 times current body armor costs, outfitting the entire military of roughly 3 million individuals would cost about 45 billion dollars; outfitting only those who need it most, currently deployed troops and individuals in training, would mean about 4.5 billion dollars for 300,000 people, or the whole force could be equipped through slow procurement over 10 years at that same annual rate. With a relatively long shelf life, the armor shouldn't wear out quickly, either. Even at 150,000 dollars each, vests for the 300,000 deployed and training troops would be roughly 45 billion dollars, on the order of 1/15th of a single annual budget, with the entire military equipped over 5-10 years (combat troops fielded first).
There are all kinds of options that could be implemented relatively quickly with relatively minor expense to the overall budget. Medical bills and logistical costs make up the bulk of our deployment costs; if fewer soldiers were required, due to their increased effectiveness, hundreds of billions of dollars could be shaved off in the long term, along with medical and veteran rehabilitation costs, which would be relatively low if the armor could protect against the basic threats faced today, from IEDs to AK-47s.
Do I blame the military for the weaker body armor? No.
But I do somewhat blame people in Congress for most likely suggesting "that sounds expensive" when likely the cost would be nearly negligible and super expensive body armor would still be negligible costs and it's obviously fiscally and generally stupid to not get it. I wouldn't care if it was 10% of our budget, if it could or could have prevented 90% of our current causalities I would be happy, and it still would amount to nearly nothing considering all it would do.
On the Aspect of Cost
The United States military spent roughly $300 million on Interceptor body armor in 2004; an Interceptor system costs roughly $1,585. Theoretically, that equates to roughly 189,274 units, or around 190,000 Interceptor units; to my knowledge more armor has been purchased since then, but I'm not entirely aware of how many units have been obtained or how much each vest costs.
60,000 MTV and IMTV (Improved Modular Tactical Vest) vests have been purchased by the Marines, and another 28,364 vests have been purchased by the Navy. Assuming roughly the same price as the Interceptor, around $140 million extra would have been spent on body armor since August 2008. Both of these purchases have predominantly been for the Afghanistan and Iraq efforts.
The soft body armor is rated at roughly NIJ Level IIIA, meaning it is capable of stopping 9mm rounds, .44 Magnum rounds, and anything below, but not rifle rounds.
This means that, annually, over the past 10 years, the military has spent roughly $45 million on average (to my knowledge) on body armor, most of it presumably for the Iraq and Afghanistan efforts. Even if it were $70 million on average annually, that would still be 1/10,000th of the nearly $700 billion budget the military receives. The armor worn by our primary troops, by individual human beings, accounts for less than 1/10,000th of our total spending, and that assumes I've underestimated the figure by $35 million a year. That amount equals roughly 3.5 Apache helicopters, half an F-35, or roughly 1/10th of the cost of a single B-2 bomber. The armor worn by medics, tank crews, soldiers on foot, and repair crews, practically everyone who could benefit from personal defense, is given only 1/10,000th of the total budget, practically nothing in comparison to our other purchases.
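As a sanity check, the 1/10,000th figure is trivial arithmetic; the spending estimate and the $700 billion budget below are this essay's own rough numbers, not official figures.

```python
# Body armor's share of the military budget, using the essay's rough numbers.
annual_armor_spend = 70e6   # generous upper estimate of armor spending, $/year
annual_budget = 700e9       # approximate total military budget, $/year

fraction = annual_armor_spend / annual_budget
print(f"1/{round(1 / fraction):,}")  # -> 1/10,000
```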
I actually like how much we spend on these expensive vehicles, and I believe it wholly necessary given their capabilities; I'm not suggesting we get rid of a B-2 bomber or half an F-35 in order to procure more body armor, but I think the relative costs put the issue into perspective. The military could afford better body armor for its primary deployed force (about 200,000 people overseas) for the equivalent of pennies out of its annual budget. The cost of body armor at the moment is nearly negligible, meaning we could theoretically spend gratuitous amounts on it, and yet the armor still has trouble stopping rifle rounds. Cost is not a significant factor in my opinion; even if a more effective option were, say, 10 times more expensive, or roughly $15,000 per vest, its overall cost would be nothing compared to its potential impact (such as actually making our soldiers rifle-resistant, opening up a whole new line of capabilities and protection), or compared to the medical bills we currently pay, as horrific and awful as those are.
Another option I'm not exactly suggesting, but that will hopefully also give a sense of scale, is Nanocomp CNT (carbon nanotube) armor. An example of Nanocomp CNT material that currently exists is a 2mm piece, roughly the thickness of a few business cards, capable of stopping a 9mm round without significant penetration or destruction of the material. The company says it is targeting growth and maturation akin to carbon fiber, with a target material cost of $350/kg to $400/kg. I'm not sure whether it is capable of stopping more rounds, what type of 9mm rounds (fired at higher velocities) it can handle, or whether it can withstand repeated impacts, but this puts the armor at roughly NIJ Level II capabilities, assuming minimum performance. Comparable Kevlar vests of this armor rating are generally around 6mm thick.
This makes the Nanocomp armor roughly 3 times stronger than Kevlar on a thickness-to-thickness basis, assuming minimum capabilities. It's reasonable to assume it has roughly the same mass as Kevlar at this thickness, although it may be slightly less, since carbon nanotubes are around 1.3-1.4 grams per cubic centimeter while Kevlar is around 1.4. The target cost is around $350-400 per kilogram, so each pound would likely be $160-180. Let's assume it ends up at $500 per kilogram, or about $230 per pound, since this is only a predicted mass-production price and it could be more. A standard 16 lb Interceptor vest made out of this material would be around $3,600, or 2.3 times its current price. That would have meant a roughly $690 million cost for 192,000 vests, instead of $300 million, a negligible increase considering the billions spent annually, perhaps almost equal to a B-2 bomber (not that I don't support B-2 bombers; I believe their existence and use is necessary). And this assumes the armor is only 3 times as strong, when it is likely 4 times; it assumes the cost is $500 per kilogram, when it is likely $350-400 (roughly $490-560 million total); and it ignores the possibility that better nanocomp-type materials are already available. This would mean tripling the effectiveness of modern body armor, assuming minimum capacities, for a practically negligible cost.
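The vest-cost estimate can be reproduced in a few lines; the $500/kg material price, 16 lb vest weight, $1,585 Interceptor unit cost, and 192,000-vest buy are all the essay's assumed figures, not vendor quotes.

```python
# Nanocomp vest cost sketch using the essay's assumed figures.
LB_PER_KG = 2.20462

price_per_kg = 500.0                       # pessimistic material cost, $/kg
price_per_lb = price_per_kg / LB_PER_KG    # ~$227 per pound
vest_weight_lb = 16.0
vest_cost = price_per_lb * vest_weight_lb  # ~$3,600 per vest

current_vest_cost = 1585.0                 # Interceptor unit cost from the text
fleet = 192_000                            # approximate 2004 buy

print(round(vest_cost))                          # ~3,600 dollars
print(round(vest_cost / current_vest_cost, 1))   # ~2.3x current price
print(round(vest_cost * fleet / 1e6))            # ~690-700 million dollars
```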
Admittedly, it's easy to see why such an armor system might not yet be available: these products are only approaching mass production, and their effectiveness has yet to face the rigor of military testing, so unknown or untested complications may arise.
Something I do believe in supporting immediately is M5 fiber. M5 fiber is about as strong as Kevlar at around half the thickness and weight; it is elastic, much more heat tolerant, and very chemically stable, meaning its likelihood of breaking down due to UV, chemicals, or heat is relatively low. Its heat tolerance is supposedly above that of Nomex, making it potentially better than what current firefighters wear and also potentially useful as general armor in warm environments. It carries roughly the same properties as Kevlar, although the elasticity may make it more durable overall and even potentially more capable against repeated impacts, given its ability to yield when stretched rather than being permanently stretched and therefore weakened.
Liquid body armor, nano-silica particles suspended in ethylene glycol after being baked in an oven, is known to increase Kevlar's strength by roughly 3.5 times. The material is a shear-thickening fluid (STF), a non-Newtonian fluid that behaves like a solid under stress or impact, like corn starch in water. Four layers of STF-treated Kevlar can dissipate the same amount of energy as 14 layers of neat Kevlar (roughly 3.5 times, although it performs worse against knives and low-velocity impacts). In addition, STF-treated fibers don't stretch as far on impact as ordinary fibers, meaning that bullets don't penetrate as deeply into the armor or transfer as much blunt-force trauma to the wearer; the researchers theorize that this is because it takes more energy for the bullet to stretch the STF-treated fibers. This means that, potentially, STF-treated M5 fiber could be flat-out 7 times stronger than Kevlar on a mass-to-mass basis, and roughly so on a thickness basis; a typical Interceptor vest could then be 7-8 times more protective, or protective enough to warrant full-body armor, or more armor that is lighter but retains equal or greater strength. Combined with increased multiple-shot protection due to increased stretch resistance and stretching capability, the armor may provide legitimate multiple-round protection against extremely powerful rounds, particularly higher-velocity rounds, potentially including rifle rounds. Both liquid body armor and M5 fiber offer higher protection against higher-velocity rounds, potentially allowing for greater stopping power against a variety of generally high-velocity rounds, such as rifle rounds or certain armor-piercing rounds. I am not entirely aware of the costs of either system, but even if it were 10 times the cost of current body armor, the cost would still be negligible.
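The "7 times" figure is simply the product of the two improvement factors cited in the text, assuming (optimistically) that the gains multiply cleanly:

```python
# STF treatment: 4 layers match 14 layers of neat Kevlar -> 14/4 = 3.5x.
# M5 fiber: roughly twice Kevlar's strength at equal weight -> 2x (per the
# M5 discussion above; this clean multiplication is an assumption).
stf_factor = 14 / 4   # 3.5
m5_factor = 2.0

print(stf_factor * m5_factor)  # 7.0
```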
At $15,000 per vest or suit, 10 times current body armor costs, the entire military, roughly 3 million individuals, could be equipped for roughly $45 billion; those who actually need them most, currently deployed troops and individuals in training, would mean roughly $4.5 billion for 300,000 people, or slow procurement over 10 years for the entire military. With a relatively high shelf life, the armor shouldn't wear out over time either. Even at $150,000 each, equipping just general combat troops and those in training would be roughly 1/15th of the annual budget, or that cost spread over 5-10 years for the entire military (fielded to combat troops first).
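These procurement scenarios reduce to simple multiplication; the troop counts and the per-vest price are the essay's round numbers.

```python
# Procurement scenarios at 10x the current per-vest cost.
vest_cost = 15_000               # dollars per vest, 10x current cost

whole_military = vest_cost * 3_000_000   # entire military
deployed = vest_cost * 300_000           # deployed troops + training

print(whole_military / 1e9)  # 45.0 (billion dollars)
print(deployed / 1e9)        # 4.5 (billion dollars)
```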
There are all kinds of options available that could be implemented relatively quickly, with relatively minor expense to the overall budget. Even so, medical bills and logistical costs make up the bulk of our deployment costs. If fewer soldiers were required, due to their increased effectiveness, hundreds of billions of dollars could be shaved off in the long term, along with medical bills and veteran rehabilitation costs, which would be relatively low if the armor could protect against the basic threats faced today, from IEDs to AK-47s.
Do I blame the military for the weaker body armor? No.
But I do somewhat blame people in Congress for most likely concluding "that sounds expensive," when the cost would likely be nearly negligible; even super-expensive body armor would still be a negligible cost, and it is obviously stupid, fiscally and generally, not to get it. I wouldn't care if it were 10% of our budget: if it could prevent, or could have prevented, 90% of our current casualties, I would be happy, and it would still amount to nearly nothing considering all it would do.
Friday, August 24, 2012
Thorium and How to solve our Energy Problems
Thorium has the potential to replace our entire energy supply, including electricity, gasoline, and natural gas, either via the molten salt reactor method or the sub-critical method. The first has the potential to decrease electricity costs by a factor of ten or more, while the sub-critical method could potentially decrease electricity costs by a factor of 200. Thorium is only mildly radioactive, producing alpha particles (helium nuclei), which cannot penetrate the skin; a thorium reactor has no chance of a meltdown; and thorium is 3-4 times more abundant than uranium, does not require a fuel enrichment process, can help burn up nuclear waste, and is potentially over 200 times more efficient than current uranium reactors. Soil on average contains about 6 parts per million of thorium, or around 1 pound per every 166,667 pounds of other material; thorium is 3 times as common as tin and about as common as lead, suggesting that if thorium were inherently dangerous, we'd likely already be dead.
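The "1 pound per roughly 166,667 pounds" figure is just the reciprocal of the 6 ppm abundance:

```python
# Convert parts-per-million abundance to a pounds-per-pound ratio.
ppm = 6  # average thorium concentration in soil, parts per million
soil_per_pound_of_thorium = 1_000_000 / ppm
print(round(soil_per_pound_of_thorium))  # 166667
```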
The Molten Salt reactor method
The sub-critical method is capable of reducing the cost of electricity to roughly 1/200th of its current amount.
Even if electricity were just 10 times cheaper, a multitude of things would become possible. Turning mercury into gold, which generally costs roughly 6-9 times the value of the resulting gold in electricity, could become profitable. More practical applications include carbon fiber, whose cost lies mostly in production: nylon and rayon, the precursors carbon fiber is made from, are relatively cheap; it is the price of production where the difficulty lies. If energy and manufacturing costs were reduced, some of which could be alleviated by extremely cheap or nearly free energy, it might be possible to mass-produce carbon fiber as a replacement for steel. At around 1.5 grams per cubic centimeter, compared to 7.85 for iron, it is less than 1/5th the weight of steel. If much of the furnishing and engine remained the same while most of the frame and other parts of the vehicle were replaced with carbon fiber, vehicles might realistically weigh 1/3 of their initial weight with no other change in performance. Carbon fiber, which is in many cases stronger than steel, would be safer both due to its higher strength and due to the overall lighter weight of the vehicle. While more expensive per pound than steel, currently around $15 per pound, if that were cut to just 1/3, carbon fiber would be $5 per pound, and effectively less per equivalent part (since the lower density means less mass is needed), making it feasible as a steel replacement. In 2004, the U.S. imported roughly 65% of its energy supply; if we could reach just under 3 times greater efficiency in automobiles, not only could the U.S. reduce its spending significantly, but we could eliminate our dependence on foreign oil and lengthen the life of our own oil reserves without overstressing them. Given the high price of global oil, local oil should be cheaper, decreasing the price Americans pay for oil even further.
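The weight and price claims above follow from two quick ratios; the densities are standard handbook values, and the $15/lb carbon fiber price is the essay's figure.

```python
# Density ratio behind the "less than 1/5th the weight of steel" claim.
carbon_fiber_density = 1.5  # g/cm^3, typical carbon fiber composite
iron_density = 7.85         # g/cm^3

ratio = carbon_fiber_density / iron_density
print(round(ratio, 2))  # 0.19, i.e. less than 1/5

# Target price: the essay's $15/lb today, cut to 1/3.
print(15 / 3)  # 5.0 dollars per pound
```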
Conversely, cheap carbon fiber may give electric cars the range and battery life they need, as a lighter weight translates to longer potential distance traveled per charge and less power required to move the vehicle, meaning that effective range and battery life could be roughly tripled. By simply reducing the weight of the vehicle, much is improved; by having more strength at a lighter weight, safety improves phenomenally.