Topic: forastero's thread
forastero



Posts: 458
Joined: Oct. 2011

(Permalink) Posted: Dec. 04 2011,11:24   

Quote (OgreMkV @ Dec. 03 2011,20:57)
Quote (JonF @ Dec. 03 2011,19:52)
Let's get down and dirty! Let's do some numbers!

We just dug up a fine, pristine sample of subaerial (solidified in the air) lava and we want to know how old it is. We'll use K-Ar dating because the equations are simple, although probably too complex for our friend. So we take it to the lab, and they do the measurements and report that the 40Ar/40K ratio is 0.036738. So we plug that value into our age equation:

t = (1/lambda) * ln[1 + (lambda/lambda_e) * (40Ar/40K)]

where lambda is the total decay constant of 40K, 0.0000000005543 per year according to mainstream science, and lambda_e is the partial decay constant for the electron-capture branch that produces 40Ar, 0.0000000000581 per year.

(Oohh! Scary!1!!one!! An equation!! with a Greek letter1!!!1one And a natural logarithm11!!!!1111one11!!ONE1. It's OK, forastero, we know you can't handle such complexity.)

And the answer is ..... 542,000,000 years! Just at the beginning of the Cambrian.
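JonF's arithmetic checks out. Here's a quick sketch of the calculation (the total decay constant and ratio are from the post; the electron-capture partial constant lambda_e is the standard companion value, assumed here since the post only quotes the total):

```python
import math

# K-Ar age from a measured 40Ar/40K ratio.
# lam is from the post; lam_e is the standard electron-capture
# partial constant, an assumption not stated in the post.
lam = 5.543e-10     # total decay constant of 40K, per year
lam_e = 0.581e-10   # partial constant for 40K -> 40Ar, per year
ratio = 0.036738    # measured 40Ar/40K

t = math.log(1 + (lam / lam_e) * ratio) / lam
print(round(t / 1e6), "million years")  # 542 million years
```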

But our ol' pal forastero tells us that rock is less than 6,000 years old. And that the decay constant isn't really constant. So let's investigate that. Assume the decay constant was something else when the rock solidified, and just changed to the modern value a few hundred years ago when we wouldn't notice. What change in the decay constant would produce a rock with a 40Ar/40K ratio of 0.036738 in less than 6,000 years?

10,000,000 percent. I.e., the decay constant would have to be 100,000 times larger to produce that ratio in less than 6,000 years. To be exact, in 5,420 years.

100,000 times the radiation intensity. 100,000 times the heat intensity. And that's spreading out the change over all 5,420 years. If forastero wants the change to be over a shorter period of time ... well ... even more radiation and heat intensity.
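The 100,000x figure falls straight out of the equation: if every decay channel of 40K speeds up by the same factor, the branching ratio cancels, the measured ratio pins down lambda times t, and the inferred age simply scales as 1/lambda. A sketch, assuming the 542-million-year result above:

```python
# If all of 40K's decay channels scale together, age scales as
# 1/lambda, so the required multiplier is just the ratio of ages.
t_constant_rate = 542e6  # years, at the accepted decay constant
t_young_earth = 5420     # years, forastero's timescale

factor = t_constant_rate / t_young_earth
print(factor)  # 100000.0, i.e. an increase of about 10,000,000 percent
```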

Here's a plot of how the age of my hypothetical rock would be affected by changes in the decay constant, spread out over the entire life of the rock. (Forastero: it's a log-log plot so the result is a nice straight line. I realize you don't understand any of that.)

[plot: apparent age of the rock vs. change in decay constant, log-log scale]
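The straight line on log-log axes is no accident: for a fixed measured ratio, age is proportional to 1/lambda, so log(age) is linear in log(lambda) with slope exactly -1. A minimal illustration (my own sketch, not JonF's code):

```python
import math

# For a fixed measured 40Ar/40K ratio, t = (accumulated decay) / lambda,
# so multiplying lambda by m divides the age by m.
base_age = 542e6  # years, at the accepted decay constant

def age(multiplier):
    return base_age / multiplier

# slope on a log-log plot between any two multipliers is -1
m1, m2 = 10.0, 1000.0
slope = (math.log10(age(m2)) - math.log10(age(m1))) / (math.log10(m2) - math.log10(m1))
print(slope)  # -1.0
```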
This is a nit, but what if we suppose that the decay constant changes at a constant rate over time? The reason I suggest this is that a steady change is easier to explain than a massive decrease over a very short time. (Forastero doesn't have an explanation, or even the beginnings of a plan for figuring out either one, but that's not unexpected.)

For about 50-60 years it has been constant as far as we can tell, but we've only had really, really good measurements for what, 20-30 years?

So let's use forastero's number of 0.5% and say that's the change over 50 years? Everyone OK with that?

I, however, refuse to delve into the calculus this would require.  I have had a horrible row with my child and I don't feel like it.

Regardless, I can make some predictions. Since the rate of change now is very slow, the decay constant must have been even higher than JonF's calculated value when the rock was formed.

So, I would actually think that JonF's calculated change would be (roughly) the median point, and that when the rock was formed the decay constant would have to be something like 100,000,000 times larger, decreasing slowly ever since.
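The calculus OgreMkV begged off can be sketched numerically (my own illustration, with assumed numbers, not his). Suppose the decay constant fell off exponentially from the rock's formation 6,000 years ago to today's value; the total accumulated decay still has to match what 542 million years at the constant rate produces:

```python
import math

# Going back s years from today, assume lambda(s) = lam_now * exp(k * s).
# The accumulated decay integral over 6,000 years must equal
# lam_now * 5.42e8, the value a constant rate gives over 542 Myr.
lam_now = 5.543e-10          # today's total decay constant, per year
target = lam_now * 5.42e8    # required integral of lambda(s) ds, ~0.30

def accumulated(k):
    # integral_0^6000 of lam_now * exp(k*s) ds
    return lam_now * (math.exp(6000 * k) - 1) / k

# accumulated(k) grows monotonically with k, so bisect for the root
lo, hi = 1e-6, 0.01
for _ in range(100):
    mid = (lo + hi) / 2
    if accumulated(mid) < target:
        lo = mid
    else:
        hi = mid
k = (lo + hi) / 2
initial_multiplier = math.exp(6000 * k)
print(f"decay constant at formation: ~{initial_multiplier:,.0f}x today's")
# roughly a million times today's value
```

So under a smooth exponential decline, the formation-time rate comes out even larger than the 100,000x needed for an abrupt change, which is the direction OgreMkV predicted.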

And, at the rate we're going now, it'll only be a few more decades before the decay constant is zero and radiation will stop.  We should probably alert the NRC (no, not the NRA; the Nuclear Regulatory Commission) and tell them it's pretty much futile to build any more fission reactors.  And tritium watches and sights... useless in just a few short decades... and you can forget about radiation therapy and radiation-based medical imaging.

Wow.  Scary stuff...

But even weirder will be when the decay rate goes negative and atoms that previously underwent nuclear fission start fusing with random alpha, beta, and gamma particles.  That's gonna be creepy as hell.

Then, I guess that since the universe will have to have negative entropy as a whole at this point, the expansion will stop and the Big Crunch will begin.  In fact, in just another 6000-7000 years, the entire universe will be crunched beyond the point where physics can predict what's going on.

All because some clown couldn't accept that radioactive decay really is a constant.

I just thought of something else... that would explain a lot.  Since the decay rate was effectively infinite at the time of the Big Bang, then all the matter and energy in the universe was actually bound up into a single nucleus of nearly infinite proportions.  The Big Bang really was a fission explosion.

The sad part is, except for the complete lack of evidence, this actually makes more sense than what forastero has been preaching.

Again, you're so caught up in pseudouniformitarianism that you insist that even the unsteadiness is steady. The fact of the matter, though, is that varying alterations have been found just days apart. Heck, you and Tracy even admitted a week or so ago that cosmic rays have different influences in different places. The reason for these differences is due to vast numbers of variables and catalysts.

The problem is that every time these decay oscillations are rediscovered, they get swept back under the rug.

  
1510 replies since Oct. 21 2011,05:55
