The Harper government has recently been touting a number of new initiatives in the High North of Canada as a way to assert Canadian sovereignty over the region. (The CBC has a nice summary map of where a number of these projects are planned.) I think it’s interesting to note, though, that while the government has been proposing all these new plans, those of us actively involved in Arctic research know that it has been steadily cutting back on ways for scientists to access the region. For example, this past year the Polar Continental Shelf Program (PCSP) had half the fuel of previous years to distribute to researchers for logistical support, which meant that if you wanted to work in the Arctic you had to come up with a lot more money than in the past. Admittedly, this was partly due to an increase in fuel prices and not just cutbacks, but what about other research funding? NSERC has seen a large number of cuts to its programs, at both the professor and student level. This means that costly field work in the Arctic simply cannot be done for lack of funds. For years, scientists have been establishing our sovereignty in the region by doing research to find out what is actually there. While the Harper government is worried about using the military to assert our power over the region, wouldn’t it be just as (if not more) important to send researchers up to see what’s there? It makes both scientific (knowledge) and economic (resource exploitation) sense, while at the same time showing the world that we have a vested interest in our own land.
I recently ran across a nice summary of two papers in Science about how poorly biologists understand the computational tools, and computational biology in general, that their work now depends on. The Science authors lament the lack of knowledge in biological labs of how these tools — gene sequencers, for example — actually operate. I would tend to argue that the same could be said for almost any research lab today, on virtually any subject. I feel these researchers are perhaps longing for the mythic days of yore, when a man of knowledge really did know a bit of everything, as there was arguably much less to know. Every research lab out there is probably lacking in its knowledge of something, whether it’s the biologist who doesn’t understand his computational tools, or the computational biologist who knows too little about how actual organisms live and reproduce. So while I can understand why they worry, I wonder if this is simply the way things are to be in a world of super-specialists.
Science, 2009. DOI: 10.1126/science.1173876
Science, 2009. DOI: 10.1126/science.1176016
I was recently working with some maps, trying to create some pretty pictures for my thesis using some of the maps by Ron Blakey. However, I was too lazy to plot points on the maps by hand, so I wanted to georeference the maps. I found a handy tool in QGIS that was easy enough to use, but it wasn’t as precise (or as quick) as another command-line tool I found called gdal_translate. GDAL (Geospatial Data Abstraction Library) and the associated OGR comprise a number of tools and libraries that are very useful for both raster and vector geographic image manipulation. Using gdal_translate was easiest with a straight lat/long projection, as I could just give the geographic coordinates for each of the corners and it would spit out a GeoTIFF. Thus, I was able to quickly georeference some of the rectangular maps and have something that I could use in QGIS. While I was playing around with gdal_translate, I found another tool called gdalwarp, which has been giving me a bit more of a headache. Theoretically, gdalwarp should let me transform some of these rectangular maps into different, possibly ‘prettier’ looking projections. So far, though, all I’ve managed is to make some seriously un-pretty images. I’ll post an update in a little while with some quick instructions once I figure out how to make it all work.
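For the curious, the corner-coordinate trick boils down to one affine transform. With gdal_translate it’s a one-liner like `gdal_translate -a_srs EPSG:4326 -a_ullr -180 90 180 -90 map.tif map_geo.tif` (flags are real, file names illustrative), and the math behind it is simple enough to sketch in a few lines of Python — this is just the standard six-coefficient GDAL geotransform, not anything from my own scripts:

```python
# Sketch of the affine geotransform implied by georeferencing a
# north-up image from its corner coordinates. GDAL stores this as six
# coefficients: (origin_x, pixel_w, 0, origin_y, 0, -pixel_h).

def geotransform_from_corners(ulx, uly, lrx, lry, width_px, height_px):
    """GDAL-style geotransform for an image whose upper-left corner is
    (ulx, uly) and lower-right corner is (lrx, lry)."""
    pixel_w = (lrx - ulx) / width_px
    pixel_h = (uly - lry) / height_px
    return (ulx, pixel_w, 0.0, uly, 0.0, -pixel_h)

def pixel_to_lonlat(gt, col, row):
    """Map a pixel (col, row) to geographic coordinates using gt."""
    lon = gt[0] + col * gt[1] + row * gt[2]
    lat = gt[3] + col * gt[4] + row * gt[5]
    return lon, lat

# A whole-world map, 3600 x 1800 pixels:
gt = geotransform_from_corners(-180.0, 90.0, 180.0, -90.0, 3600, 1800)
print(gt)                              # (-180.0, 0.1, 0.0, 90.0, 0.0, -0.1)
print(pixel_to_lonlat(gt, 1800, 900))  # centre pixel -> (0.0, 0.0)
```

The rotation terms (the two zeros) are what gdalwarp starts fiddling with when you reproject, which is presumably where my un-pretty images are coming from.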
A paper from the Journal of Biogeography was published online today about ecological niche modeling of the Sasquatch. While the name of the paper may excite a lot of cryptozoologists and Bigfoot believers, the authors use this oft-sighted-never-captured creature as a good example of why questionable occurrence data should always be taken with a grain of salt. The paper actually concludes by stating that, based on the overlap of the predicted ranges of Sasquatch and black bears, most of the reported sightings have been mistaken identities (although it is difficult to tell if they mean Bigfoot is really just a black bear, or if black bears are actually just really hairy cousins of ours).
The Guardian has a report on tool marks found on Neanderthal bones that one archaeologist argues are evidence that we used them for barbeque. While the tool marks are inconclusive as to the final fate of the flesh from the bones, they do provide some evidence that Neanderthals and humans were interacting in at least some fashion. I always find it an interesting thought that we were not always alone on the planet when it came to (arguably) conscious beings.
I just uploaded a Python plugin that I made called PostGPS to the QGIS repositories. It is still very rough, and I will likely be rewriting most of it shortly, but I put it up now for a field course that I will be teaching. The most important part of the tool is that it can take any point layer (or only selected points) from within QGIS and convert it to GPX format. We will likely be using this a fair bit, as we will have a number of crews on the go simultaneously while we’re in Saskatchewan, and each crew will need their own GPS loaded with the coordinates of the sites they should be collecting. Hopefully I find time to do a bit of a rewrite before we are in the field, and get it working a little more smoothly (and remove the dependency on the psycopg module, as it’s not really necessary any more).
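For anyone wondering why GPX is the obvious target format: it’s just plain XML, so the conversion is nearly trivial. This isn’t the actual PostGPS code — just a minimal sketch of the core idea using only the Python standard library, with made-up waypoint names and coordinates:

```python
# Minimal sketch: turn a list of points into a GPX 1.1 document.
# GPX is plain XML, so the standard library is all we need.
import xml.etree.ElementTree as ET

def points_to_gpx(points):
    """points: iterable of (name, lat, lon) tuples -> GPX 1.1 string."""
    gpx = ET.Element("gpx", version="1.1", creator="example",
                     xmlns="http://www.topografix.com/GPX/1/1")
    for name, lat, lon in points:
        wpt = ET.SubElement(gpx, "wpt", lat=str(lat), lon=str(lon))
        ET.SubElement(wpt, "name").text = name
    return ET.tostring(gpx, encoding="unicode")

# Hypothetical field localities:
sites = [("quarry-1", 49.1334, -106.5511),
         ("quarry-2", 49.1402, -106.5623)]
print(points_to_gpx(sites))
```

Most handheld GPS units (and tools like gpsbabel) will happily ingest a file like this, which is what makes it handy for handing each crew their day’s coordinates.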
A recent article from National Geographic describes how an Italian stonecutting company (which makes countertops) was slicing through a piece of Egyptian limestone when they came across the fossilized bone of an Eocene whale embedded inside. Additionally, after palaeontologists traced the origin of the rock back to the quarry, they also found another layer above the limestone containing a cornucopia of smaller mammal fossils. My only thought watching the video clip was that the next time we need to move overburden, we should rent an earthmover like the one they’re using.
The Scientist (subscription required) published a story recently about how Merck — famous(!) for such drugs as Vioxx — created a fake journal to publish findings favourable to its products. The journal, entitled Australasian Journal of Bone and Joint Medicine, was even published by Elsevier, one of the largest journal publishers around. Obviously, I’m not aware of a lot of palaeontological groups that would have the marketing budget to pull this off, although the anti-evolution Creation Research Society does publish a journal called (unimaginatively) the Creation Research Society Quarterly Journal. While the public health issues with the Merck journal are obviously of much more importance, the CRS journal demonstrates that you don’t need a huge marketing budget to publish misinformation (although the CRS would argue that’s what I do too). I would imagine that with the migration of journals to electronic formats, the financial barrier to entry for self-publishing a “science” journal is getting smaller every day. Who knows, maybe soon we’ll have the Matthew Vavrek Journal of Awesome and Totally Accurate Research.
I have been eyeing up the Nokia n810 tablet recently, thinking about how useful it might be in a field setting. The n810 is essentially a small, slimmed-down computer running on an ARM processor (versus the typical Intel chip in most computers). ARM chips are used in most low-power devices such as mobile phones, where battery life is an important consideration. Likewise, out in the field we don’t always have easy access to electricity, although we have started to experiment with solar panels and battery packs. There are similar (and likely more computationally powerful) devices that use Intel chips, but at this point their price is a fair bit higher, and from what I could see the power draw is greater as well. The nice thing about the n810 is that it can be charged from a USB port, versus the much higher power requirements of something like a laptop. While the n810 comes with a Linux distribution called Maemo, people have recently managed to port Ubuntu to the device (as well, Ubuntu has been working on an ARM distribution). By using something like Ubuntu on the n810, we would have a full suite of applications to use, from word processing (although I think OpenOffice on it would be a little unwieldy) to GIS.
While this seems like a neat idea, the question also becomes: why? While some things like GIS might be nice, so that we could be updating our databases in the field, they’re probably not necessary. I have thought about whether it would be usable for reading PDFs or books, to cut down on weight, but how much weight would we really cut out if we need portable power solutions anyhow? I would still like to try it out for writing things like manuscripts, but the small keyboard size may make this a problem too.
Nonetheless, I think that it could be a worthwhile thing to try, as the costs to try it out are always coming down.
The yearly vertebrate field course is coming up in about a month, and we’ve started some more serious planning for it. One thing I am thinking about is the best way to teach people how to use some simple GIS and statistics programs, possibly while in a cook tent in the field, with everyone on their own laptops of every variety. I have been thinking of setting up a small server and local network that everyone can get on, as a way to share information quickly without resorting to swapping USB keys. I had the idea of getting something like an Eee PC or Acer Aspire One and turning it into a server, and using a cheap wireless router (we have an old one in the lab, but they only cost $25 anyhow) to set it up. I suppose I could probably go without the router, but it might make things easier than also trying to deal with setting up an ad hoc network on the server. I thought that by using something small and low-powered like an Atom-based netbook, we wouldn’t have to worry about power draw, or even about it getting broken, as they’re so cheap. As it is, it would only really be used as a file server and not be doing much number crunching itself, so I would think it should work fine (if a little slowly).
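The file-serving part really is the easy bit. As a sketch of what the netbook would actually run — assuming nothing fancier than serving a directory of mapfiles over HTTP so everyone can grab them from a browser or with wget — the Python standard library already does it (on the command line it’s just `python -m http.server 8000`; the directory path below is hypothetical):

```python
# Sketch: serve a directory read-only over HTTP on the local network,
# using only the Python standard library.
import functools
import http.server
import socketserver

def make_server(directory, port=8000):
    """Build an HTTP server rooted at `directory`, listening on `port`."""
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory)
    return socketserver.TCPServer(("", port), handler)

if __name__ == "__main__":
    # Hypothetical path where the course mapfiles would live:
    with make_server("/srv/fieldcourse") as httpd:
        print("Serving on http://0.0.0.0:8000/")
        httpd.serve_forever()
```

Anything with a browser — whatever mix of laptops people show up with — can then pull files from the netbook’s address, which is exactly the laptop-agnostic setup I’m after.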
Once the server part is set up, I need to get everyone set up with a copy of R and Quantum GIS on their own laptops. With QGIS, everyone can access and view the mapfiles I’ll have on the mini-server, and even update things, if I let them, when we find new localities. They can also use their own computers to download any old localities to their GPS units, instead of me doing it all. As for R, we can have an “Intro to R” session and get everyone running the same data set (preferably something we found in the field), and, if we’re really ambitious, even get everyone to upload their finished results to the server to be marked.
However, if this whole idea doesn’t pan out so well, I’m still playing around with creating live USB keys for everyone to boot into. It’s easy enough to create them now with Ubuntu 9.04; the only thing I need to figure out is how to get it to work with OS X. I thought as well about live CDs with the additional components, but then there is the issue of data persistence, as well as problems with computer speed. As well, if people start showing up with netbooks, CDs won’t work anyhow. At any rate, one of these ideas should work.