Monthly climatic data from the Eads Bridge, from 1893 to the 1960s. It’s a comma-separated file (.csv) that can be imported into pretty much any spreadsheet program.
The last three columns are mean (MMNT), minimum (MNMT), and maximum (MXMT) monthly temperature data, which are good candidates for analysis by pre-calculus students who are studying sinusoidal functions. For an extra challenge, students can also try analyzing the total monthly precipitation patterns (TPCP). The precipitation pattern is not nearly as nice a sinusoidal function as the temperature.
Students should try to deconstruct the curve into component functions to see the annual cycles and any longer-term patterns. This type of work would also be a precursor to the mathematics of Fourier analysis.
This data comes from the National Climatic Data Center (NCDC) website.
The website Influence Explorer has a lot of easily accessible data about the contributions of companies and prominent people to lawmakers. As a resource for civics research it’s really nice, but the time series data also makes it a useful resource for math: algebra and pre-calculus in particular.
The first has to do with the spark-the-imagination excitement that comes from being able to work with a live, real, scientific record (updated every month) that is so easy to grab (from Scripps), and is extremely relevant given all the issues we’re dealing with regarding global climate change.
The second is that the data is very clearly the sum of two different types of functions. The exponential growth of CO2 concentration in the atmosphere over the last 60 years dominates, but does not swamp, the annual sinusoidal variability as local plants respond to the seasons.
1. Fit the exponential trend

Add an exponential trendline in a spreadsheet program, or do the regression manually. The manual regression (which I’ve found gives the best match) produces an equation of the form:

y = a·e^(b·t) + c

while the built-in exponential trendline will usually give something simpler, like:

y = a·e^(b·t)
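A sketch of what the regression might look like outside a spreadsheet, using Python’s scipy. The numbers here are made-up stand-ins for the Mauna Loa record, not the actual values:

```python
import numpy as np
from scipy.optimize import curve_fit

# Stand-in for the Mauna Loa record: 60 years of monthly CO2 values (ppm)
# generated from an assumed exponential trend plus a little noise
t = np.arange(0, 60, 1 / 12)
rng = np.random.default_rng(0)
co2 = 315 * np.exp(0.0045 * t) + rng.normal(0, 0.5, t.size)

def exp_model(t, a, b):
    # y = a * e^(b*t): the simpler form a built-in trendline gives
    return a * np.exp(b * t)

(a, b), _ = curve_fit(exp_model, t, co2, p0=(300, 0.005))
# a and b should come back close to the 315 and 0.0045 used to build the data
```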
2. Subtract the exponential function.
Put the exponential function (model) into your spreadsheet program and subtract it from the data set. The result should be just the annual sinusoidal function.
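The subtraction step is just element-by-element arithmetic. A minimal sketch, again with stand-in numbers rather than the real record:

```python
import numpy as np

# Stand-in series: an exponential trend plus a 3 ppm annual sine cycle
t = np.arange(0, 10, 1 / 12)
data = 315 * np.exp(0.0045 * t) + 3 * np.sin(2 * np.pi * t)

# Subtract the fitted exponential model from the data; the residual
# should be just the annual cycle, oscillating around zero
residual = data - 315 * np.exp(0.0045 * t)
```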
If you look carefully you might also see what looks like a longer-wavelength periodicity overlain on top of the annual cycle. You can attempt to extract it if you wish.
3. Decipher the annual sinusoidal function
Try to match the stripped dataset with a sinusoidal function of the form:

y = a·sin(b(t − c)) + d
A good place to start at finding the best-fit coefficients is by recognizing that:
a = amplitude;
b = frequency (2π divided by the wavelength);
c = phase (to shift the curve left or right); and
d = vertical offset (this sets the baseline of the curve).
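Assuming coefficients in the form above, the fitting step might be sketched like this. The detrended data is made up; the amplitude and offset guesses come straight from the data, and the annual cycle suggests b = 2π when t is in years:

```python
import numpy as np
from scipy.optimize import curve_fit

def sine_model(t, a, b, c, d):
    # a: amplitude, b: frequency, c: phase shift, d: vertical offset
    return a * np.sin(b * (t - c)) + d

# Hypothetical detrended monthly data with a one-year cycle
t = np.arange(0, 10, 1 / 12)
y = 3 * np.sin(2 * np.pi * (t - 0.1)) + 0.1

# Starting guesses: amplitude and offset read off the data,
# frequency from the known annual wavelength (1 year -> b = 2*pi)
p0 = ((y.max() - y.min()) / 2, 2 * np.pi, 0, y.mean())
(a, b, c, d), _ = curve_fit(sine_model, t, y, p0=p0)
```

Good starting guesses matter here: a general-purpose fitter can wander off to a completely wrong frequency if you start it too far from the annual cycle.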
Wrap up
Now you have a model for carbon dioxide concentration, so you should be able to predict, for example, what the concentration will be for each month in the years 2020, 2050 and 2100 if the trends continue as they have for the last 60 years. This is the first step in predicting annual temperatures based on increasing CO2 concentrations.
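With illustrative (not actually fitted) coefficients, the prediction step is just plugging the years into the combined model:

```python
import numpy as np

# Combined model with illustrative coefficients; t is years since 1958,
# roughly the start of the Mauna Loa record
def co2_model(t):
    return 315 * np.exp(0.0045 * t) + 3 * np.sin(2 * np.pi * t)

predictions = {year: co2_model(year - 1958) for year in (2020, 2050, 2100)}
```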
Inverse relationships pop up everywhere. They’re pretty common in physics (see Boyle’s Law, for example: P ∝ 1/V), but there you sort of expect them. You don’t quite expect to see them in the number of views of my blog posts, as shown in the Popular Posts section of the column to the right.
Table 1: Views of the posts on the Montessori Muddle in the previous month as of October 16th, 2012.
And if you think about it, it sort of makes sense that this relationship should be inverse: as you get to lower-ranked (less visited) posts, the number of views should asymptotically approach zero.
Questions
So, given this data, can my pre-Calculus students find the equation for the best-fit inverse function? That way I could estimate how many hits my 20th or 100th ranked post gets per month.
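One way students might attack this outside a spreadsheet. The view counts below are hypothetical stand-ins for the table’s actual numbers:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical monthly view counts for posts ranked 1 through 10
# (stand-ins for the actual numbers in the table)
rank = np.arange(1.0, 11.0)
views = np.array([1000.0, 510, 330, 260, 195, 170, 150, 125, 115, 100])

def inverse_model(x, a, b):
    # views = a / x^b; b near 1 means a plain inverse relationship
    return a / x ** b

(a, b), _ = curve_fit(inverse_model, rank, views, p0=(1000, 1))

# Estimated monthly hits for the 20th and 100th ranked posts
est20, est100 = inverse_model(20, a, b), inverse_model(100, a, b)
```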
Can my Calculus students use the function they come up with to estimate the total number of hits on all of my posts over the last month? Or even the top 20 most popular posts?
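For the Calculus version of the question, the total can be estimated either by summing the model at each rank or by integrating it. The coefficient here is illustrative, assuming the fit came out close to a plain inverse:

```python
from scipy.integrate import quad

# Assume the fitted inverse model came out as views(x) = a / x,
# with an illustrative coefficient
a = 1000

# Total hits on the top 20 posts: a discrete sum over the ranks,
# or an integral as a quick calculus approximation
total_sum = sum(a / x for x in range(1, 21))
total_integral, _ = quad(lambda x: a / x, 0.5, 20.5)
```

The two answers should land within a few percent of each other, which is itself a nice discussion point about when an integral is a good stand-in for a sum.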
One of the first things that my pre-Calculus students need to learn is how to do a least squares regression to match any type of function to real datasets. So I’m teaching them the most general method possible using MS Excel’s iterative Solver, which is pretty easy to work with once you get the hang of it.
I’m teaching the pre-Calculus using a graphical approach, and I want to emphasize that the main reason we study the different classes of functions — straight lines, polynomials, exponential curves etc.— is because of how useful they are at modeling real data in all sorts of scientific and non-scientific applications.
So I’m starting each topic with some real data: either data they collect (e.g. bring water to a boil) or data they can download (e.g. atmospheric CO2 from Mauna Loa). However, while it’s easy enough to pick two points, or draw a straight line by eye, and then determine its linear equation, it’s much trickier, if not impossible, when dealing with polynomials or transcendental functions like exponentials or square roots. They need a technique they can use to match any type of function, and least squares regression is the most commonly used method of doing this. While calculators and spreadsheet programs, like Excel, use least squares regression to draw trendlines on their graphs, they can’t do all the different types of functions we need to deal with.
The one issue that has come up is that not everyone has Excel and Solver. Neither OpenOffice nor Apple’s spreadsheet software (Numbers) has a good equivalent. However, if you have a good initial guess, based on a few datapoints, you can fit curves reasonably well by changing their coefficients in the spreadsheet by hand to minimize the error.
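For what it’s worth, what Solver does under the hood is adjust the coefficients to minimize the sum of squared errors between the model and the data. A minimal sketch in Python, with made-up data and a straight-line model:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up data that's roughly linear
x = np.array([0.0, 1, 2, 3, 4])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# The quantity Solver minimizes: the sum of squared differences
# between the data and the model (here a line y = m*x + b)
def sse(params):
    m, b = params
    return np.sum((y - (m * x + b)) ** 2)

result = minimize(sse, x0=(1, 0))  # x0 is the initial guess, like Solver's starting cells
m, b = result.x
```

The same sse-minimization idea works unchanged if you swap the line for an exponential, a sinusoid, or any other model, which is exactly why it’s worth teaching as the general method.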
I’m working on a post on how to do the linear regression with Excel and Solver. It should be up shortly.
Notes
If Solver is not available in the Tools menu you may have to activate it, because it’s an Add-In. Wikihow explains activation.
Some versions of Excel for the Mac don’t have Solver built in, but you can download it from Frontline.
Want to find your nearest superfund site? The EPA has an interactive page called Clean Up My Community that maps brownfields, hazardous waste, and superfund sites anywhere in the U.S.
Note:
Brownfields are places, usually in cities, that can’t be easily redeveloped because there’s some existing pollution on the site.
Superfund sites are places where there is hazardous pollution that the government is cleaning up because the companies that caused the pollution have gone out of business, or because the government caused the pollution in the first place. The military is probably the biggest source of government pollution, particularly from fuel leaks and radioactive waste.
The EPA’s EnviroMapper website is a great way to identify sources of hazardous materials and other types of pollution in your area, which might be a good way of stirring up student interest in the topic.
Not only can you map the broad categories of pollution (air, water, radiation, etc.), but you can also find specific information about the different types of pollution or potential pollution the EPA has information about. I found a nearby site with sulfuric acid, for example.
And, if you want to slog through a lot of closely written reports, you can find a lot more details about any site you come across. Some of this information might also be useful – who knows?