I grew up on science fiction. I loved reading stories of The Future, where space travel was commonplace, where all energy was generated cleanly, and where we worked side-by-side with machine intelligences to accomplish tasks.
Of course, that wasn’t what most of the books were about; space travel, clean energy, and A.I. were just part of the background. And many of the books I read had issues– most failed to predict the role women play in society (or failed to credit them at all, aside from being a love interest for the hero), quite a few were somewhat racist, and all of them glossed over the problems of everyday life that couldn’t simply be resolved with technology– but on space travel, A.I., and clean energy, they were quite prescient.
“But Ryan, none of those have happened yet,” I hear you saying. And you’re right, none of these have yet reached their full potential. But in a world where SpaceX, Blue Origin, Rocket Lab, Virgin Galactic, and so many others are all working to bring affordable spaceflight to the masses (and getting pretty darn close), I think it’s safe to say that space travel will soon[1. Assuming a generous definition of “soon”] become far more commonplace than it is today. We won’t be living in Star Trek, but costs will come down and won’t be the same barrier that they are today.
And in a world where renewable energy continues to make great strides, I think it’s safe for me to predict that we are well on the way to moving the bulk of our electrical generation away from fossil fuels– there are some problems to solve with electrical transmission and energy storage, but we keep moving closer to a solution. As more companies compete for a slice of the renewable energy pie, prices will go down, efficiency will go up, and an increasing percentage of electrical generation will be from renewable resources. Energy will increasingly be stored in batteries of one kind or another, whether they be chemical (such as Tesla’s Powerpacks), mechanical (such as pumped hydropower or compressed air), or thermal (such as molten salt paired with a concentrated solar tower). Those batteries will allow more and more of the baseload supply of energy to come from renewables, reducing dependence on fossil fuels and paving the way for them to be slowly phased out.
Lastly, we already work incredibly closely with machine intelligences, though that intelligence isn’t quite what was predicted (at least, not yet). Work is increasingly done on computers with incredibly complicated error checking algorithms, designed to reduce mistakes that cost money and lives. Our data is sorted and analyzed by powerful computers that can correlate information thousands of times faster than we can, letting us build more complicated and accurate models to help us learn even more about the world we live in. And most impressively (for me), we can now literally talk to our computers and ask them to do things for us. I know that the first two examples are far more impactful to society, but I still can’t quite believe that I can just talk to my Google Home, have it correctly interpret what I’m asking for, and then give me an answer that makes sense and fulfills my need. It still feels like magic.
The science fiction writers I read in my youth may have written about a world that was full of more straightforward adventure than the one we live in today. But what our world lacks in swashbuckling heroes with laser swords, it more than makes up for with the amount of sheer technical wonder that we have at our fingertips each day. Fifty years ago, for me to write and publish this article would have required a bulky typewriter, a mimeograph machine to create copies of the typed article, a bunch of paper, and a stapler to staple my article up on random light poles for people to read. Today, I wrote most of this post on my Chromebook connected to the hotspot on my phone, while waiting for my car’s inspection to be finished. To publish it, I hit the big blue “publish” button at the top of the page, and trusted it would be sent out to your screen when you wanted to read it. The amount of time and effort saved by modern technology for this article alone could probably cover my grocery bill for a week. And if that’s not evidence of us living in the future, I don’t know what is.
A vulnerability in the WordPress REST API allows malicious actors to edit any post on a site. It was patched in WordPress v4.7.2 two weeks ago, but millions of sites haven’t yet updated, leaving them exposed.
Ars Technica has a very nice writeup on the effects of the exploit, which has resulted in the defacement of a staggering number of websites (including the websites of Glenn Beck, the Utah Office of Tourism, and even the official SUSE Linux site). Sucuri and Wordfence also have very good articles about the effects of the vulnerability.
If you have a WordPress site, you should immediately check to make sure you’re on the latest version (v4.7.2).
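If you have shell access to your server, WP-CLI (assuming it’s installed) will tell you the running version with `wp core version`. Below is a minimal sketch of checking that version against the patched release, using `sort -V` for the comparison; the `current` value is a stand-in for WP-CLI’s real output.

```shell
#!/bin/sh
# Sketch: is this WordPress install at or above the patched version?
PATCHED="4.7.2"

# version_lt A B -> succeeds if version A sorts strictly below version B
version_lt() {
    [ "$1" != "$2" ] && [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

# On the server itself you would capture the real version:
#   current=$(wp core version)
current="4.7.1"   # placeholder value for illustration

if version_lt "$current" "$PATCHED"; then
    echo "vulnerable: running $current, update to $PATCHED"
else
    echo "ok: running $current"
fi
```

Dashboard-only users can just check Dashboard → Updates instead.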
I’ve finally moved to a VPS on DigitalOcean, from my previous (free) shared hosting. I did this for a couple of reasons: first, while my hosting was free for a year with my domain name, that year was almost up. To renew my hosting for the second year and beyond, I would have needed to pay $38.88/year; while that’s a decent price, I looked at my options and decided that moving to DigitalOcean wouldn’t cost much more (around $30 more across the year, since I use the weekly backups option), would give me much more control over my server (now I get SSH access!), and would centralize all of my VPS instances in the same place (I’ve used DigitalOcean for several years to host various projects).
Of course, as with so many things, this migration wasn’t sparked by a simple glance at the calendar. While I’ve intended to move my host for the last month or two, the timing was decided by my messing up a WordPress upgrade on the old site at the beginning of December. I used the automatic updater, ignored the warnings about making sure everything was backed up first[1. I didn’t actually ignore this warning. I had a backup plugin configured on the site; I figured I could probably roll back if I really needed to.], and told it to apply the new version. When WordPress exited maintenance mode, I was locked out of the administration dashboard. The public part of the website was still up and running, but the backend was locked off. Since I was entering finals week at my university, I decided to just let it be until I had some time to come back and fix it. Worst-case, I had backups I could restore from, and I’d been meaning to migrate my site anyway.
Of course, things didn’t work out that way. When I finally had some time on Christmas Eve, I discovered that a complete backup hadn’t been made in months.
Turns out, if you don't verify that your backups are working properly, they might not be a viable restore medium.
Yes, I committed the cardinal sin of not verifying the state of my backups. Apparently I’d screwed something up with their configuration, and I’d never tried to restore from them before and hadn’t noticed until I needed them. At this point, I decided that if the backups weren’t working, there was no point in trying to recover on a host that I was going to be abandoning within a month, and I spun up a WordPress droplet on DigitalOcean to hold the rebuilt site.
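The lesson here generalizes: a backup you’ve never test-restored is just a hope. A test restore can be sketched in a few lines; everything below uses fabricated paths and a dummy archive so it’s self-contained, but on a real site `$BACKUP` would come from your backup plugin or cron job.

```shell
#!/bin/sh
# Sketch: verify a backup by actually restoring it somewhere disposable.
# The "backup" here is fabricated so the example is self-contained.
mkdir -p /tmp/site && echo "hello" > /tmp/site/index.html
tar -czf /tmp/backup-latest.tar.gz -C /tmp site
BACKUP=/tmp/backup-latest.tar.gz

# 1. The archive should exist, be non-empty, and not be corrupt.
[ -s "$BACKUP" ] || { echo "backup missing or empty"; exit 1; }
gzip -t "$BACKUP" || { echo "backup archive is corrupt"; exit 1; }

# 2. Unpack it into a scratch directory and check the contents arrived.
rm -rf /tmp/restore-test && mkdir -p /tmp/restore-test
tar -xzf "$BACKUP" -C /tmp/restore-test
[ -s /tmp/restore-test/site/index.html ] || { echo "restore incomplete"; exit 1; }

# 3. For WordPress, a real test would also load the SQL dump into a
#    scratch database (e.g. mysql scratch_db < database.sql) and
#    spot-check a few posts.
echo "backup restores cleanly"
```

Running something like this on a schedule, and alerting when it fails, would have caught my months-old broken backups long before I needed them.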
But I've been meaning to move my hosting over to a VPS on @digitalocean for a while now, and this is a perfect opportunity to do that.
I still had copies of all the content that was on the site, so with some copying, pasting, and time, I’d be able to restore everything without much trouble. But before I did all of that, I thought “what if I’m overlooking something really simple with the old site?” I did a little searching, and apparently W3 Total Cache, which I used to create static pages for my site and decrease load times, can cause problems with WordPress upgrades. I disabled it via FTP[2. If you’re in a similar situation, just renaming the plugin folder to something else– w3-total-cache to w3-total-cache123, for example– will disable it], reloaded the site, and I was able to access the admin area again. Turns out the simple steps that you should take before completely rebuilding everything are actually worth it.
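The rename trick from the footnote works because WordPress locates plugins by directory name; if the directory disappears, the plugin is simply skipped. Here’s the same move as shell commands, simulated under /tmp so it’s harmless to run; on a real site the path would be wp-content/plugins inside your WordPress install, reached over FTP or SSH.

```shell
#!/bin/sh
# Sketch of disabling a plugin by renaming its directory.
# Simulated layout under /tmp; adjust the path for a real install.
PLUGINS=/tmp/wp-demo/wp-content/plugins
mkdir -p "$PLUGINS/w3-total-cache"

# Disable: WordPress can no longer find a directory by the expected name.
mv "$PLUGINS/w3-total-cache" "$PLUGINS/w3-total-cache.disabled"

# Re-enable later by renaming it back:
#   mv "$PLUGINS/w3-total-cache.disabled" "$PLUGINS/w3-total-cache"
ls "$PLUGINS"
```

Nothing is deleted, so this is completely reversible once the upgrade trouble is sorted out.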
And of course, while I move from one host to another, I solved my original problem. C'est la vie.
Since I had already spun up and started configuring my new site, I decided to press onwards. My task was made considerably easier by my being able to access WP Clone on the original site, which let me move everything from my old site to the new one in just a few minutes. I redirected the nameservers to DigitalOcean, and ran a few last checks before calling the bulk of my work done.
The next day, when I was tidying up some loose ends and preparing to get SSL set up, I realized that my email no longer worked– my email server resided on the same server that hosted my old website, which meant I needed to find a new solution.
And we're mostly back online. Email is still acting up, but we're close to being done.
While I have been meaning to set up my own email server sometime soon, I wasn’t confident in my ability to get it up and running quickly, and email is one of those vital services I depend on working 100% of the time. In years past, I would have simply used Google Apps[3. Which is now G Suite, but that sounds silly.] to host my email, but that is no longer the free option it once was. Luckily, I found a solution thanks to Ian Macalinao at Simply Ian: using Mailgun as a free email server. Mailgun is designed to send out massive email blasts for major companies, but they also offer a free tier for people and companies sending fewer than 10,000 emails per month. I send a fraction of that number, so this was perfect for me (and their mass email prices seem quite reasonable, so I might even use them for that if the need ever arises). Ian handily provided a set of instructions for setting up the proper routing, and, while some of the menu options have changed, I was able to get my new email up and running within a few minutes.
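The setup boils down to pointing a few DNS records at Mailgun. The records below follow Mailgun’s documented pattern but use placeholder values throughout– example.com stands in for the real domain, and the Mailgun control panel supplies the exact hostnames and keys (including the DKIM selector, which varies by account).

```
; Illustrative DNS records for routing a domain's mail through Mailgun.
; All values are placeholders -- Mailgun's dashboard provides the real ones.
example.com.                  MX    10 mxa.mailgun.org.
example.com.                  MX    10 mxb.mailgun.org.
example.com.                  TXT   "v=spf1 include:mailgun.org ~all"
smtp._domainkey.example.com.  TXT   "k=rsa; p=<DKIM public key from Mailgun>"
```

Receiving mail also requires creating a Route in Mailgun’s dashboard, forwarding addresses at your domain to a mailbox you actually read.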
Well, I have email up and running, but ssl still isn't working. Going to give the letsencrypt tool another shot before trying something else
So I’d managed to get both the site and my email working, but I still couldn’t get SSL up and running. For those that don’t know, SSL stands for Secure Sockets Layer, and it’s what powers the little green padlock you see in your address bar when you visit your bank, or PayPal, or this website. I wrote an explanation of how it works a while back, and I suggest checking that out if you want to learn more.
One of the benefits of hosting my website on a VPS is that I don’t need to use the major third-party SSL providers to get certificates saying my server is who it says it is; I can use the free and open Let’s Encrypt certificate authority instead. Unfortunately, I just couldn’t get the certificate to work correctly; the automated tool was unable to connect to my server and verify it, which meant that the auto-renewal process wouldn’t complete. I could have generated an offline certificate and used that, but the certificates only last ninety days and I wasn’t looking forward to repeating the setup process every three months.[4. It’s a pretty straightforward and simple process, I just know that I would forget about it at some point, the certificate would expire, and the site would have issues. If I can automate that issue away, I would much rather do that.] I tried creating new Virtual Hosts files for Apache, my web server, but that just created more problems. Eventually, I figured out that I had misconfigured something somewhere along the line. Rather than try to figure out which of the dozens of edits I had made was the problem, I gave up and reverted to a snapshot I had made before starting down the rabbit hole.[5. Snapshots are essentially DigitalOcean’s version of creating disk images of your server. I absolutely love snapshots; they’ve saved my bacon more than once, and I try to always take one before I embark on any major system changes.] Once I had rolled back to before my virtual hosts meddling, I was able to successfully run the Let’s Encrypt tool, generate my certificate, and secure my site.
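For the record, the happy path is short. These are the standard certbot invocations rather than a transcript of my exact session– they need root on a live server, and example.com is a placeholder for the real domain.

```shell
# Standard Let's Encrypt setup for Apache (run as root on the server).
# example.com is a placeholder, not my actual domain or invocation.

# Obtain a certificate and let certbot edit the Apache config for you:
sudo certbot --apache -d example.com -d www.example.com

# Certificates expire after ninety days; a scheduled job keeps them fresh:
sudo certbot renew --quiet
```

The `--apache` plugin handles both the domain-verification challenge and the virtual host changes, which is exactly the part I broke by editing the Virtual Hosts files by hand.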
…that was my own fault. Turns out I didn't actually need to rewrite all of Apache's virtual hosts, and I probably shouldn't have tried.
Have you ever heard of Hacking Team? It’s an Italian company specializing in “digital infiltration” products for governments, law enforcement agencies, and large corporations. Simply put, they sell hacking tools.
You might think, given their business model, that they would monitor their own security religiously. Last year, however, they were hacked. Majorly hacked. “Hundreds of Gb” of their internal files, emails, documents, and source code for their products were released online for all to inspect, as were their unencrypted passwords [1. By the way, here’s some advice: if you are in security (or anything, really, this isn’t security-specific) you should really make sure your passwords are more secure than “P4ssword”, “wolverine”, and “universo”. Use a passphrase instead.]. Also released was a list of their customers, which included the governments of the United States, Russia, and Sudan—the last being a country controlled by an oppressive regime that has been embargoed by the E.U. [2. As an Italian company, this means that they were technically violating the embargo.]
I was particularly struck by how the attacker gained access to the network. According to Phineas Fisher, the pseudonymous hacker who claimed responsibility for the breach,
Hacking Team had very little exposed to the internet. For example, unlike Gamma Group, their customer support site needed a client certificate to connect. What they had was their main website (a Joomla blog in which Joomscan didn’t find anything serious), a mail server, a couple routers, two VPN appliances, and a spam filtering appliance… I had three options: look for a 0day in Joomla, look for a 0day in postfix, or look for a 0day in one of the embedded devices. A 0day in an embedded device seemed like the easiest option, and after two weeks of work reverse engineering, I got a remote root exploit… I did a lot of work and testing before using the exploit against Hacking Team. I wrote a backdoored firmware, and compiled various post-exploitation tools for the embedded device.
Basically, to avoid detection, Phineas found a previously unknown vulnerability [3. Vulnerabilities like these are called “zero-days” in computer security circles, because the company maintaining the software or device has known about them for zero days, and so has had no chance to fix them before they’re exploited.] in one of their embedded devices (likely one of their routers), figured out how to use it to get into the rest of the network, and then carried out the attack through that piece of hardware without anybody noticing. No matter your feelings about the attack, this is an impressive feat.
Today I reconfigured a server I maintain for the Office of Residential Life and Housing. It broke yesterday because of a database issue, but I’ve taken this as an opportunity to rebuild and improve it with an included email server. I have it mostly up and running now, but it’s been a long, slow process that took far longer than I expected it to (as a sidenote, this would have been far easier if the backups I had were up-to-date. Always check your backups!)
Building an email server is more difficult than I expected. I almost expected to just run sudo apt-get install postfix and have an email server up and running; sure, it would need some configuration, but I’d be able to start sending and receiving mail almost immediately. And yes, that might be true if I installed something like Mail-in-a-Box or iRedMail, but I decided that that was too easy, jumped into the deep end, and immediately started configuring a mail server using Postfix, Dovecot, MySQL, and SpamAssassin (and would have been instantly lost if it hadn’t been for this awesome guide). So I spent twelve hours copying and adapting code to my purpose, rewriting databases, adding users, and restarting when I messed up.
It was absolutely awesome.
There’s something about taking the blank screen of a terminal, typing commands across it, and making something work. When you reload the page and it actually works the way you want it to, there is an immense feeling of satisfaction and accomplishment. You took something that was blank and empty, and turned it into something useful. There’s no feeling quite like it in the world.
That said, I’m totally using one of the ready-to-deploy email servers next time. Making something work is fantastic when you have the time for it, but sometimes you just need whatever you’re working on to be up and running.
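For reference, the “deep end” route maps onto a handful of Debian/Ubuntu packages. This is a sketch of the components involved, not a complete working setup– installing them is the easy part, and the guide linked above covers the many config files that follow.

```shell
# The components behind the hand-built mail server, as Debian/Ubuntu
# packages. Installing them is easy; the real work is in configuration.
sudo apt-get install postfix postfix-mysql                     # SMTP: sends and receives mail
sudo apt-get install dovecot-core dovecot-imapd dovecot-mysql  # IMAP: lets clients read mailboxes
sudo apt-get install mysql-server                              # stores virtual users and domains
sudo apt-get install spamassassin spamc                        # spam filtering
```

Each component talks to MySQL for its list of virtual users, which is why so much of the twelve hours went into database wiring rather than the mail daemons themselves.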
Does risk play a factor in the housing market? Obviously it does—if it didn’t, Detroit wouldn’t be in trouble right now, prices in high-crime areas would be the same as everywhere else, and we’d have subdivisions built on top of active volcanoes (at least in places other than Hawaii). But does perceived risk have an impact on housing prices? Obviously not in every case—if it happened all the time, California’s earthquake risks would result in a very empty state, or very low housing prices—but what about with something that is generally perceived by the public as unsafe, despite a pretty good safety record? For example, what about with nuclear reactors?
I chose this topic because I’ve had an interest in nuclear development and nuclear power for many years, and I already had a decent grounding in the history and background. In addition, nuclear power is often portrayed by the media and Hollywood as unsafe, despite a stellar safety record in the United States, making it an ideal candidate to see if perceived risk could influence housing prices. This was something that I’d been wondering about since starting my economics degree, so I was thrilled by the fantastic opportunity to actually research this by mapping data and analyzing the results.
I began with some basic research on whether housing prices could be affected by risk. One paper I found, Does a Property-Specific Environmental Health Risk Create a “Neighborhood” Housing Price Stigma? Arsenic in Private Well Water, written by Kevin Boyle, Nicolai Kuminoff, Congwen Zhang, Michael Devanney, and Kathleen Bell, studied the impact of arsenic in the well water of two towns in Maine. They found that housing prices had been significantly depressed after the discovery of arsenic, though the effect lasted a mere two years. They contrasted that with numerous other studies that focused on Superfund cleanup, where the effects of contamination on housing prices could lower housing prices in the surrounding areas for decades following the successful cleanup of a site (Boyle et al.).
Two other excellent papers that discussed the reduction of property values were Estimates of the Impact of Crime Risk on Property Values from Megan’s Laws (Linden and Rockoff), which discussed the effect registered sex offenders have on local property values, and Estimation of Perceived Risk and Its Effect on Property Values (McCluskey and Rausser), which concludes that both media coverage and past perception of risk influence current risk perception—and that increased perception of risk lowers property prices.
When I walked away from my research, I was far more confident in my ability to draw some conclusions based on what the data said—people were just as influenced by perceived risk as they were by actual risk. But where would I 1) be able to find a reactor in a fairly risk-free area (or at least one perceived as being risk-free), and 2) be able to find the data I would need? After doing some searching, I decided to use the Palo Verde Nuclear Generating Station, for several reasons. Firstly, because it is in a fairly stable area—the Phoenix, AZ, area isn’t prone to violent weather, flooding, earthquakes, or any other sort of natural phenomenon that might influence perceived risk (they do have some water issues, but I assumed for the sake of this project that people don’t take longer-term risks into account when deciding where to live). Phoenix is regarded by many as a generally pleasant place to live, though perhaps a touch on the warm side. Secondly, because the Palo Verde NGS [1. Unless otherwise noted, Palo Verde Nuclear Generating Station, Palo Verde NGS, and PVNGS all refer to the same thing, which is the plant. “Palo Verde”, used by itself, refers to the nearby town.] is the largest nuclear generating station in the United States—if people are concerned about living next to a nuclear generating station, they’d likely be most concerned about living next door to the largest NGS in the United States. And thirdly, I chose PVNGS for a much more personal reason: I’m originally from Phoenix, and I seize any opportunity to look at anything in the Phoenix area… especially during a chilly St. Louis spring.
I also decided to focus my research within a ten mile buffer zone of PVNGS; nuclear plants such as the one at Palo Verde are required to have a ten-mile evacuation zone around them. Within this zone, warning sirens are tested regularly, radiation monitoring is conducted, and instructions for evacuation are distributed regularly. I assumed that this ten-mile zone would be where the dangers of the generating station would be most emphasized, and therefore where the largest effects on housing prices would be.
After I figured out what I was doing and where I’d be doing my analysis, I started looking for sources of data. I’d decided to look at both housing prices, and at household income. I found that Esri’s ArcGIS Online community had both in prepackaged datasets; unfortunately, they didn’t appear to allow me to view their underlying data, which made them useless for calculating population and estimating both housing prices and household income in a particular area. Since the easiest way to get my data (using Esri to do it for me) was out, I turned to the Census Bureau. I was focusing on housing prices and on household income, so I picked the American Community Survey as the best source of data that encompassed both, and I chose to use 2011 data because the 2012 data didn’t want to import into ArcMap properly on my computer.
In order to properly compare housing prices and income, I would need to have a community that was roughly comparable to the area around PVNGS, while being far enough away that it wouldn’t be influenced in any way by the nuclear plant. After looking at most of the small communities in Arizona that were between 40 and 60 miles from downtown Phoenix (Palo Verde NGS is approximately 50 miles from downtown Phoenix), I selected Coolidge, AZ, as my control community. It had roughly the same demographics, approximately the same population, and was located well away from PVNGS.
Next up was figuring out how many people lived near Palo Verde NGS, so I could double-check against the population numbers I had for Coolidge. This was originally a challenge, mostly because I wanted a fairly accurate number from what are essentially giant tracts of Census data (for the area I was in, the Census data was limited to tracts [or at least that was all I could find]). I checked to see if anyone else had had this issue before me, and I was rather surprised to learn that MSNBC investigative reporter Bill Dedman had written a very interesting analysis called Nuclear Neighbors: Population Rises Near U.S. Reactors, which included an analysis of population growth near nuclear power plants between 2000 and 2010. While the results weren’t directly applicable to my project, since the data was from a different year than what I was working with and involved different variables, a paragraph at the bottom of the article was: it noted that, because the population data was contained in census tracts, the data had been averaged based on how much of the tract was included in the buffer. A little more searching led me to instructions from Esri on how to do the same thing for the dataset I was using, and for the household income and housing prices values.
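The weighting itself is simple enough to sketch with made-up numbers: each tract’s value counts in proportion to how much of the tract’s area falls inside the ten-mile buffer. The tract names and values below are invented purely for illustration; the real calculation ran inside ArcMap against the ACS data.

```shell
#!/bin/sh
# Sketch of areal weighting with invented tract values: a tract's median
# housing value is weighted by the fraction of its area inside the buffer.
estimate=$(awk '
  # columns: tract id, median housing value, fraction of tract in buffer
  { weight_sum += $3; value_sum += $2 * $3 }
  END { printf "%.0f", value_sum / weight_sum }
' <<'EOF'
tract1 52000 0.90
tract2 61000 0.35
tract3 45000 0.10
EOF
)

echo "estimated median housing value inside buffer: \$$estimate"
# -> estimated median housing value inside buffer: $53815
```

The obvious limitation, which bit me with the income data, is that this assumes values are spread evenly across each tract; a tract whose homes cluster in the 10% that lies inside the buffer will be badly misrepresented.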
Of course, this also presented a challenge in how to interpret this averaged data. Ultimately, issues with the data I had averaged led me to drop household income from my dataset—the numbers I was getting for average household income were simply too low. It’s completely possible that the average household in some areas only makes $3 per year, but Occam’s Razor would suggest that I was experiencing errors with my data instead. I had hoped that I would be able to go back, figure out my error, and obtain the correct data, but I simply didn’t have enough time to do so. This is more than a little unfortunate, because I believe that household income data is tied closely to housing prices. Leaving income out of the analysis only paints part of the picture.
This actually brings up a very important point about the rest of my analysis; while the numbers I received for housing prices are quite reasonable, and the numbers I had from Coolidge were quite close to the numbers I found on the Census website, they are estimates based on how much property is near the plant, and assume that every tract has a single average value. A manual review of the data shows property values that are quite sane, taking into account the terrain and population of the area, but it must be remembered that they are estimates based off of estimated data (since the ACS only surveys a fraction of the residences in an area, and uses those surveys to estimate values for the entire area). Unfortunately, without GIS data from the Maricopa County Assessor’s Office (which proved too cost-prohibitive to acquire), there was no method I could find that would have resulted in a better estimate.
Keeping all of the above caveats in mind, along with the fact that my sample size for this survey was one nuclear power plant and one control city, I can tentatively conclude that housing prices near nuclear reactors are lower than housing prices in comparable communities. This doesn’t really surprise me—my research into the history and development of nuclear power shows that the “not in my backyard” mentality (known by the fairly catchy acronym NIMBY) is extremely pronounced when it comes to the construction and operation of nuclear reactors—though I am surprised that it was as pronounced as it is. According to my data, median housing prices around Palo Verde NGS are around $13,000 lower than the median housing prices in Coolidge– $48,232 versus $61,285. I was expecting a gap of ~$5,000, not a gap of almost three times my estimate.
And given the expansion of Phoenix in the past ten to fifteen years, I conclude that prices were originally even lower and have been artificially inflated due to housing pressures; the nearby city of Buckeye, located about fifteen miles to the west, grew from a population of 6,537 in 2000 to 50,876 in 2010—a 678.3% increase in ten years. The associated housing shortage could have easily raised housing prices in the area by a significant amount, though I don’t have the required data to estimate by how much.
Unfortunately, I wasn’t able to perform the analysis I really wanted to perform. When I set out on this project, I had intended to show that the Palo Verde Nuclear Generating Station had depressed prices in the immediate area surrounding it, which, due to limitations of my data, I was unable to prove. I had intended to use Census data from the 1970, 1980, 1990, 2000, and 2010 Decennial Censuses, but unfortunately they either didn’t record the data I needed, or they were unavailable. I tried using the historical data from NHGIS, but I was unable to make it work the way I needed it to.
So, in the absence of the data that I couldn’t obtain, how could I have expanded this to increase the accuracy of my conclusions? It probably wouldn’t have been incredibly difficult to expand this to other nuclear generating stations in the United States (and, given enough of a break from school and work, I would love to do this). If I could show that the same effect was present near other nuclear plants, it would vastly increase the plausibility of my evidence. And controlling for local risk factors would probably be fairly straightforward as long as I was careful in my choice of control community.
In addition, getting the proper household income data would paint a much clearer picture of what is going on—knowing that housing prices are lower than in another community doesn’t mean too much if you can’t show that the income level in the first community is equal to or greater than that of the second community (or at least roughly comparable). In this particular case, I needed more granular data—data I couldn’t obtain without difficulty. And, in retrospect, trying to get the income data I did have to work consumed far too much time for the payoff that resulted. If I had known that the data wouldn’t work from the beginning, it would have been far better to have instead spent that time looking at other nuclear generating stations and their surrounding communities to expand my sample size beyond a single example.
“2010 Census Data – 2010 Census.” United States Census Bureau. Web. 5 May 2015.
“2011 American Community Survey Data.” United States Census Bureau. Web. 4 May 2015.
Boyle, Kevin J., et al. “Does a Property-Specific Environmental Health Risk Create a ‘Neighborhood’ Housing Price Stigma? Arsenic in Private Well Water.” Water Resources Research 46.3 (2010). Web. 9 May 2015.
Dedman, Bill. “Nuclear Neighbors: Population Rises near US Reactors.” Msnbc.com. 14 Apr. 2011. Web. 9 May 2015.
Linden, Leigh, and Jonah E. Rockoff. “Estimates of the Impact of Crime Risk on Property Values from Megan’s Laws.” American Economic Review, 2008. Web. 9 May 2015.
McCluskey, Jill J., and Gordon C. Rausser. “Estimation of Perceived Risk and Its Effect on Property Values.” Land Economics 77.1 (2001): 42–55. Web. 9 May 2015.