What is Dark Matter?

Adam Hadhazy, in Discover Magazine, summarizes the top candidates to explain dark matter and the experiments in progress to find them. These include WIMPs (Weakly Interacting Massive Particles), Axions, Sterile Neutrinos, and SIMPs (Strongly Interacting Massive Particles).

Distortions in the shapes of galaxies caused by gravitational lensing. While gravitational lensing is caused by anything with mass (normal matter as well), the lensing effect of dark matter is a key form of evidence for its presence. Image of the galaxy cluster Abell 2218 via Wikimedia Commons.

Via Brian Resnick on Vox, who provides some very interesting historical context on the discovery of dark matter.

Electron Configuration Practice

A quick electron configuration practice webpage that lets you enter the symbol for an element and see if you can write out the electron configuration in both the full and noble gas forms.

Screen capture from the electron configuration webpage. Sulfur (S) is entered, and then the long form and noble gas form of the configurations can be entered and checked. In this case, there is an error in one part of the noble gas form.

The table at the bottom is a guide to filling the electron shells and orbitals. You can click any of the blue squares to change the number of electrons in the orbital.

Update

An improved version of the orbital-filling table at the bottom is available here.

Linearizing an Exponential Function: Radioactive Decay

Using this data for the decay of a radioisotope, find its half life.

t (s) | A (g)
0 | 100
100 | 56.65706876
200 | 32.10023441
300 | 18.18705188
400 | 10.30425049
500 | 5.838086287
600 | 3.307688562

We can start with the equation for decay based on the half life:

   A = A_0 (\frac{1}{2})^\frac{t}{\lambda}  

 
where:
A = \text{Amount of radioisotope (usually a mass)}
A_0 = \text{Initial amount of radioisotope (usually a mass)}
t = \text{time}
\lambda = \text{half life}

and linearize (make it so it can be plotted as a straight line) by using logarithms.

Take the log of each side (use base 2 because of the half life):

  \log_2{(A)} = \log_2{  \left( A_0 (\frac{1}{2})^\frac{t}{\lambda} \right)} 

Use the rules of logarithms to simplify:

 \log_2{(A)} = \log_2{ ( A_0 )} + \log_2{  \left( (\frac{1}{2})^\frac{t}{\lambda} \right)}   

  \log_2{(A)} = \log_2{ ( A_0 )} +  \frac{t}{\lambda}  \log_2{   (\frac{1}{2}) }      

 \log_2{(A)} = \log_2{ ( A_0 )} +  \frac{t}{\lambda}  (-1)   

  \log_2{(A)} = \log_2{ ( A_0 )} -  \frac{t}{\lambda}       

Finally rearrange a little:

  \log_2{(A)} =  -  \frac{t}{\lambda}  +  \log_2{ ( A_0 )}      

  \log_2{(A)} =  -  \frac{1}{\lambda} t +  \log_2{ ( A_0 )}       

Now, since the two variables in the last equation are A and t, we can see the analogy between this equation and the equation of a straight line:

 \log_2{(A)} =  -  \frac{1}{\lambda} t +  \log_2{ ( A_0 )}        

and,

   y =  m x +  b       

where:

   y = \log_2{(A)}        

   m =  -  \frac{1}{\lambda}        

   x = t       

   b =  \log_2{ ( A_0 )}        

So if we draw a graph with log₂(A) on the y-axis and time (t) on the x-axis, the slope of the line should be:

   m =  -  \frac{1}{\lambda}        

which we can use to find the half life (λ).
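As a quick check, here is a minimal R sketch of this method for the data table above: take log₂ of the amounts, fit a straight line with lm(), and convert the slope back into a half life.

# Find the half life by linearizing the decay data above.
t <- seq(0, 600, by = 100)                    # time (s)
A <- c(100, 56.65706876, 32.10023441, 18.18705188,
       10.30425049, 5.838086287, 3.307688562) # amount (g)

logA <- log2(A)          # y = log2(A)
fit <- lm(logA ~ t)      # slope m = -1/lambda
m <- coef(fit)[["t"]]
-1 / m                   # half life lambda, about 122 s for this data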

Radioactive Half Lives

Since we most commonly talk about radioactive decay in terms of half lives, we can write the equation for the amount of a radioisotope (A) as a function of time (t) as:

  A = A_0 (\frac{1}{2})^\frac{t}{\lambda} 

where:
A = \text{Amount of radioisotope (usually a mass)}
A_0 = \text{Initial amount of radioisotope (usually a mass)}
t = \text{time}
\lambda = \text{half life}

To reverse this equation, so we can find the age of a sample (the time), we need to solve for t:

Take the log of each side (use base 2 because of the half life):

  \log_2{(A)} = \log_2{  \left( A_0 (\frac{1}{2})^\frac{t}{\lambda} \right)} 

Use the rules of logarithms to simplify:

 \log_2{(A)} = \log_2{ ( A_0 )} + \log_2{  \left( (\frac{1}{2})^\frac{t}{\lambda} \right)}   

  \log_2{(A)} = \log_2{ ( A_0 )} +  \frac{t}{\lambda}  \log_2{   (\frac{1}{2}) }      

 \log_2{(A)} = \log_2{ ( A_0 )} +  \frac{t}{\lambda}  (-1)   

 \log_2{(A)} = \log_2{ ( A_0 )} -  \frac{t}{\lambda}      

Now rearrange and solve for t:

 \log_2{(A)} - \log_2{ ( A_0 )} = -  \frac{t}{\lambda}      

 -\lambda \left( \log_2{(A)} - \log_2{ ( A_0 )} \right) = t      

  -\lambda \cdot \log_2{ \left( \frac{A}{A_0} \right)}  = t      
 

So we end up with the equation for time (t):

  t = -\lambda \cdot \log_2{ \left( \frac{A}{A_0} \right)}         
 

Now, because this last equation is linear in log₂(A/A₀), if we're careful, we can use it to determine the half life of a radioisotope. As an assignment, find the half life for the decay of the radioisotope given below (a sketch of one way to check your answer follows the data).

t (s) | A (g)
0 | 100
100 | 56.65706876
200 | 32.10023441
300 | 18.18705188
400 | 10.30425049
500 | 5.838086287
600 | 3.307688562
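Here, again as a sketch in R, is the same calculation organized around the final equation: regressing t against log₂(A/A₀) gives a straight line through the origin whose slope is -λ.

# Find the half life using t = -lambda * log2(A/A0).
t <- seq(0, 600, by = 100)                    # time (s)
A <- c(100, 56.65706876, 32.10023441, 18.18705188,
       10.30425049, 5.838086287, 3.307688562) # amount (g)

x <- log2(A / A[1])      # log2(A/A0)
fit <- lm(t ~ x)         # slope = -lambda
-coef(fit)[["x"]]        # half life, again about 122 s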

Comparing Covid Cases of US States

Missouri's confirmed cases (z-score) compared to the other U.S. states from April 20th to October 3rd, 2020. The z-score is a measure of how far a value is from the average: in this case, a negative z-score is good because it indicates that the state is below the average number of cases (per 1000 people).
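Specifically, for a state with x cases per 1000 people, the z-score is:

 z = \frac{x - \bar{x}}{\sigma} 

where \bar{x} is the mean and \sigma is the standard deviation of the cases per 1000 people across all the states (this is what the code below computes).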

Based on my students' statistics projects, I automated the method (using R) to calculate the z-score for all the states in the U.S. We used the Johns Hopkins daily data.

I put graphs for all of the states on the COVID: The U.S. States Compared webpage.

The R script (test.R) assumes all of the data is in the folder COVID-19-master/csse_covid_19_data/csse_covid_19_daily_reports_us/, and outputs the graphs to the folder images/zscore/, which needs to exist.

covid_data <- function(infile, state="Missouri") {
    # Read one day's data file, merge in the state populations, and return
    # the z-score of the given state's confirmed cases per 1000 people.
    filename <- paste(file_dir, infile, sep='')
    mydata <- read.csv(filename)
    pop <- read.csv('state_populations.txt')
    mydata <- merge(mydata, pop)
    mydata$ConfirmedPerCapita1000 <- mydata$Confirmed / mydata$Population * 1000
    stddev <- sd(mydata$ConfirmedPerCapita1000)
    avg <- mean(mydata$ConfirmedPerCapita1000)
    cpc1k <- mydata[mydata$Province_State == state,]$ConfirmedPerCapita1000
    zscore <- (cpc1k - avg) / stddev
    return(zscore)
}

get_zScore_history <- function(state='Missouri') {
  # Build a data frame with the state's z-score for every daily data file,
  # then plot it.
  df <- data.frame(Date=as.Date(character()), zscore=numeric())
  for (f in datafiles){
    dateString <- as.Date(substring(f, 1, 10), format='%m-%d-%y')
    zscore <- covid_data(f, state=state)
    df[nrow(df) + 1,] <- list(dateString, zscore)
  }
  df$day <- 1:nrow(df)

  plot_zScore(df, state)

  # LINEAR REGRESSIONS (unused here, kept for exploration):
  # http://r-statistics.co/Linear-Regression.html
  lmod <- lm(zscore ~ day, df)

  return(df)
}

plot_zScore <- function(df, state){
  # Plot the z-score history, centered vertically on zero, and save a png.
  max_z <- max(abs(max(df$zscore)), abs(min(df$zscore)))

  plot(x=df$day, y=df$zscore, main=paste('z-score: ', state),
       xlab="Day since April 20th, 2020", ylab='z-score',
       ylim=c(-max_z, max_z))
  abline(0, 0, col='firebrick')
  dev.copy(png, paste('images/zscore/', state, '-zscore.png', sep=''))
  dev.off()
}

get_states <- function(){
  # Get the list of state names from the most recent data file.
  lastfile <- datafiles[ length(datafiles) ]
  filename <- paste(file_dir, lastfile, sep='')
  mydata <- read.csv(filename)
  pop <- read.csv('state_populations.txt')
  mydata <- merge(mydata, pop)
  return(mydata$Province_State)
}

graph_all_states <- function(){
  # Draw and save the z-score graph for every state.
  states <- get_states()
  for (state in states) {
    get_zScore_history(state)
  }
}

file_dir <- 'COVID-19-master/csse_covid_19_data/csse_covid_19_daily_reports_us/'
datafiles <- list.files(file_dir, pattern="\\.csv$")

print("To get the historical z-score data for a state run (for example):")
print(" > get_zScore_history('New York')")

df <- get_zScore_history()

You can run the code in test.R in the R console using the commands:

> source('test.R')

which does Missouri by default, but to do other states use:

> get_zScore_history('New York')

To get all the states use:

> graph_all_states()

Missouri COVID-19

For a statistics project, I took raw COVID data from Johns Hopkins University on May 20, 2020. With the data, I found the general statistics and then looked at how cases are rising in Missouri each month.

State | Confirmed | Deaths | Population | Cases per 1000
Alabama | 13052 | 522 | 4779736 | 2.73069475
Alaska | 401 | 10 | 710231 | 0.564605037
Arizona | 14906 | 747 | 6392017 | 2.33197127
Arkansas | 5003 | 107 | 2915918 | 1.715754695
California | 85997 | 3497 | 37253956 | 2.30839914
Colorado | 22797 | 1299 | 5029196 | 4.532931308
Connecticut | 39017 | 3529 | 3574097 | 10.91660355
Delaware | 8194 | 310 | 897934 | 9.125392289
District of Columbia | 7551 | 407 | 705749 | 10.69927127
Florida | 47471 | 2096 | 18801310 | 2.524877256
Georgia | 39801 | 1697 | 9687653 | 4.108425436
Hawaii | 643 | 17 | 1360301 | 0.4726895003
Idaho | 2506 | 77 | 1567582 | 1.598640454
Illinois | 100418 | 4525 | 12830632 | 7.826426633
Indiana | 29274 | 1864 | 6483802 | 4.514943547
Iowa | 15620 | 393 | 3046355 | 5.127439186
Kansas | 8507 | 202 | 2853118 | 2.981650251
Kentucky | 8167 | 376 | 4339367 | 1.88207174
Louisiana | 35316 | 2608 | 4533372 | 7.790227672
Maine | 1819 | 73 | 1328361 | 1.369356673
Maryland | 42323 | 2123 | 5773552 | 7.330496027
Massachusetts | 88970 | 6066 | 6547629 | 13.5881248
Michigan | 53009 | 5060 | 9883640 | 5.363307445
Minnesota | 17670 | 786 | 5303925 | 3.331495072
Mississippi | 11967 | 570 | 2967297 | 4.032963333
Missouri | 11528 | 640 | 5988927 | 1.92488571
Montana | 478 | 16 | 989415 | 0.4831137591
Nebraska | 11122 | 138 | 1826341 | 6.089771844
Nevada | 7388 | 377 | 2700551 | 2.735738003
New Hampshire | 3868 | 190 | 1316470 | 2.938160383
New Jersey | 150776 | 10749 | 8791894 | 17.14943333
New Mexico | 6317 | 283 | 2059179 | 3.067727478
New York | 354370 | 28636 | 19378102 | 18.28713669
North Carolina | 20262 | 726 | 9535483 | 2.124905471
North Dakota | 2095 | 49 | 672591 | 3.114820151
Ohio | 29436 | 1781 | 11536504 | 2.551552879
Oklahoma | 5532 | 299 | 3751351 | 1.474668726
Oregon | 3801 | 144 | 3831074 | 0.992149982
Pennsylvania | 68126 | 4770 | 12702379 | 5.36324731
Rhode Island | 13356 | 538 | 1052567 | 12.68897847
South Carolina | 9175 | 407 | 4625364 | 1.983627667
South Dakota | 4177 | 46 | 814180 | 5.130315164
Tennessee | 18412 | 305 | 6346105 | 2.90130718
Texas | 51673 | 1426 | 25145561 | 2.054955147
Utah | 7710 | 90 | 2763885 | 2.789551664
Vermont | 944 | 54 | 625741 | 1.50861139
Virginia | 32908 | 1075 | 8001024 | 4.112973539
Washington | 18971 | 1037 | 6724540 | 2.821159514
West Virginia | 1567 | 69 | 1852994 | 0.8456584317
Wisconsin | 13413 | 481 | 5686986 | 2.35854282
Wyoming | 787 | 11 | 563626 | 1.396315997

The table above is the raw data I extracted, plus the population of each state, which I used to calculate the cases per 1000 people by dividing the confirmed cases by the population (and multiplying by 1000). This allows you to compare each state equally. For example, for Alabama: 13052 ÷ 4,779,736 × 1000 ≈ 2.73 cases per 1000 people.

After getting the raw data I did the statistical analysis on the confirmed cases and cases per capita.

Confirmed Cases

Min. | 401
Q1 | 5268
Median | 13052
Q3 | 34112
Max | 354370
Mean | 30364
Inter-Q | 28844
Standard Dev | 5513.53
Missouri | 11528
Missouri z-score | -3.416323118

The table above is the statistical analysis of the confirmed cases, for all 50 states.

Confirmed Cases per Capita

Min. | 0.4727
Q1 | 1.9543
Median | 2.9013
Q3 | 5.2468
Max | 18.2871
Mean | 4.4639
Inter-Q | 3.2925
Standard Dev | 4.101132
Missouri | 1.92488571
Missouri z-score | -0.6191008458

The table above is the statistical analysis of the confirmed cases per capita, for all 50 states.
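For reference, here is a minimal R sketch of this analysis, assuming the state table above has been saved as states.csv (a hypothetical filename) with columns State, Confirmed, Deaths, Population, and CasesPer1000 (hypothetical names matching the table):

# Summary statistics and Missouri's z-score from the state table.
states <- read.csv('states.csv')

summary(states$Confirmed)   # Min, Q1, Median, Mean, Q3, Max
IQR(states$Confirmed)       # inter-quartile range
sd(states$Confirmed)        # standard deviation

# z-score: how many standard deviations Missouri is from the mean
mo <- states[states$State == 'Missouri',]
(mo$Confirmed - mean(states$Confirmed)) / sd(states$Confirmed)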

Missouri Predictions

After I did the analysis for all 50 states, I focused on the rise of cases in Missouri from April to September. Then I predicted the number of cases in the future if the rise in cases stays the same. More than likely the cases will be higher or lower than the predicted number. If the state implements safety precautions, the curve could flatten out. If the state does nothing and people keep taking it less and less seriously, then more than likely the curve will get steeper.

Above are the data and graphs I used to predict the cases at the beginning and the end of October. The two highlighted boxes are the predictions.

I predict there will be 130,278 cases in Missouri on the first of October. On the 21st, I predict there will be 166,268 cases.
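The predictions come from fitting a straight line to the recent case counts and extrapolating. A minimal sketch of that calculation in R, assuming the counts are in a data frame mo_cases with day and cases columns (hypothetical names, since the underlying data table isn't reproduced here):

# Extrapolate Missouri's case counts with a linear fit (a sketch).
fit <- lm(cases ~ day, mo_cases)
# predict for the day numbers corresponding to Oct 1 and Oct 21 (placeholders)
predict(fit, newdata = data.frame(day = c(150, 170)))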