Summerland Expedition

On Thursday, my dad and I set out on a trip to Summerland, British Columbia to observe and ride the Kettle Valley steam train. So far, the trip has been enjoyable and quite successful.

A couple of things have stood out so far. The first was the stop at the Spiral Tunnels on the way over the Rockies.

The other has been a restaurant we happened upon in Summerland. It is called Santorini’s and it is well worth your while to stop there if you ever find yourself passing through Summerland. It is located in the central business district, so it requires navigating off the main highway, but that extra effort will pay off. Not only is the service excellent but so is the food, from the regular menu to the daily specials. One particularly interesting (in a good way) item was the southern fried chicken served on a waffle topped with gravy and with maple syrup on the side. You are probably thinking there is no way that could possibly work but, surprisingly, it did, even with the syrup and gravy mixed together on the waffle after the chicken was gone. If they happen to be offering that special when you stop by, give it a try. You will be pleasantly surprised.

Those of you who follow my ramblings know that I rarely review or endorse businesses so you know I was truly impressed.

Of course, Summerland has a great deal more to offer. Make an expedition of it and view the other attractions in the area like the local mountain or the many wineries. Or just enjoy the spectacular scenery the Okanagan has to offer.

I now return you to your regularly scheduled randomness.

WTF, CP?

Anyone who isn’t living under a rock knows that Calgary has experienced an unprecedented flood on both the Bow and Elbow rivers. While the water is down to manageable levels now and cleanup is proceeding at a staggering pace, the state of emergency persists and a large chunk of the downtown core is still without power, not to mention low-lying areas outside the core.

In the wake of all this, at roughly 4:00 this morning, some genius at Canadian Pacific Railway thought it would be a good idea to run a loaded freight train across a bridge that is well over a century old (probably over 125 years). Ordinarily, this would not be a particularly dangerous thing, but given the unprecedented flooding and the fact that the Bow river is still running very high, the logic of this decision totally escapes me. I suspect this is the sort of decision that controllers have gotten away with many times over the years, with questionable structures surviving by pure fluke. Alas, that was not to be the case today. The bridge started to collapse just as the train had mostly finished passing over it. Clearly the bridge was not sound, and from the descriptions of the failure, it sounds like one of the bridge piers was undermined and the weight and vibration of the train’s passage caused whatever was still holding the pier up to give way. Of course, once that happened, the river flow would have ensured the collapse continued.

So the question then becomes why did the city allow the bridge to be used? After all, it is within the city limits. Well, it turns out that the city has no authority over railroads at all. That’s right. Zero. None. The city cannot even enforce noise bylaws or bar trains from blocking intersections during rush hour. It further turns out that even the province can do nothing. Apparently railroads are only beholden to the federal government and its agencies. What that means is that the city had no authority or access to inspect any of the rail bridges or to bar the railroad from operating trains. Yet it turns out that within the city limits, the city is responsible for the safety and response to any problems caused by railroads.

So, not only is the city still dealing with the aftermath of an unprecedented flood, but it also has to deal with the aftermath of a boneheaded decision by a flunky working for a private company over which the city has no authority whatsoever. Thanks to this #nenshinoun, the city has to divert resources from handling the flood cleanup to dealing with this secondary crisis.

It seems clear now that regulatory reform is absolutely required. Make railroads subject to the same municipal authority as other transportation companies. Let the municipalities manage all infrastructure within their boundaries instead of everything except the railways. After all, municipalities are uniquely qualified to manage infrastructure in their geographic areas. Furthermore, allow the provincial transportation departments to enforce their regulations as well. No more of this incomplete oversight from federal authorities who are either understaffed or simply not competent for the job.

Update 19:11. The bridge is actually 101 years old according to current news reports. It was also apparently inspected several times before the train was driven across it. It seems I was also correct in assuming that it was a failure at the bottom of a bridge pier, which, to be fair, they could not have seen in an inspection. However, since they apparently didn’t even know that the neighbouring bridge was not connected at the foundations, it is clear that they should not have been opening the bridge until they could inspect the foundations. After all, if you don’t even know what is connected together, how do you know the foundations of the bridge are still sound? Calgary was able to have some certainty about its bridges because they are anchored into actual bedrock. Any bridge not so anchored should probably be considered suspect after a flood such as we have had.

“Personal” Computing Long Term

It may be inconceivable to many people today, but the days of everybody having personal computers with massive capabilities sitting around mostly idle to read email or write a letter to grandma are numbered. The vast majority of people do not need a full-blown, totally independent computer. They don’t do anything remotely complex or out of the ordinary on it. Or, if they do, they don’t do it often enough to really warrant having such a resource-intensive device.

As technology develops, it becomes clearer and clearer that most people just want to get their email, write their letters, and maybe balance their chequebooks. Very few people need that two-dozen-core monster with four ganged high-power video cards connected to more screen real estate than Times Square. If you need it, you know it, and you likely know what you’re doing and what you’re going to pay for it. For the rest of the world, for whom tablets or smartphones seem to be sufficient, any sit-down computing they do at a keyboard and monitor certainly does not need even the low-powered computers currently available.

Now suppose that instead of every person having a desktop computer with monitor, keyboard, mouse, etc., plugged in and sucking up power, most people had, instead, a much lower power device, more akin to the processing power of a high-end smartphone, attached to a reasonably sized screen, keyboard, and mouse. Even that, alone, reduces the power footprint noticeably and if you multiply that by hundreds of millions, you’re suddenly talking about real energy reduction.

Now suppose you take some of that energy reduction and budget that for larger centralized processing resources. Make those resources available on a usage basis to people who need something more than their TinyBox™ device occasionally, say to render an edit of Christmas at Uncle Fred’s, or whatever. Or maybe they need to store some files that are larger than the TinyBox™ can handle, or maybe they want to be able to access them from other TinyBox™ devices.

Sure, it sounds like we would rapidly get back to the same energy consumption we had before. However, once the installations reach a certain usage level, something magical happens. The actual resources required to service the needs of the users stop rising linearly and start to look more like a logarithmic curve. This is the so-called economy of scale. The reason this works is that as the number of users increases, the likelihood of too many users needing resources at exactly the same time decreases. As a result, the total computing power required for all their computing needs tends to shrink. If the controller software further prioritizes particular types of workloads, the overall load can be spread into off-peak times, reducing the overall peak further.
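To make the idea concrete, here is a toy simulation of that effect, sometimes called statistical multiplexing. All the numbers are synthetic and the function name is mine: each user needs full power in any given time slot only a small fraction of the time, so a shared pool only needs capacity for the worst *simultaneous* demand, which grows much more slowly than the user count.

```php
<?php
// Toy model of statistical multiplexing (all numbers synthetic, function
// name hypothetical). A dedicated machine per user must be sized for
// 100% of users; a shared pool only needs capacity for the worst
// observed simultaneous demand.

function peakSimultaneousUsers(int $users, int $slots, float $busyFraction): int
{
    $peak = 0;
    for ($t = 0; $t < $slots; $t++) {
        $busy = 0;
        for ($u = 0; $u < $users; $u++) {
            // Each user independently needs full power this time slot.
            if (mt_rand() / mt_getrandmax() < $busyFraction) {
                $busy++;
            }
        }
        $peak = max($peak, $busy);
    }
    return $peak;
}

mt_srand(42); // fixed seed so repeated runs match
foreach ([10, 100, 1000] as $users) {
    $peak = peakSimultaneousUsers($users, 200, 0.05);
    printf("%4d users: pool capacity for %3d of them (%.0f%% of dedicated)\n",
           $users, $peak, 100 * $peak / $users);
}
```

On a typical run, the pooled capacity needed per user falls sharply as the user count climbs, which is exactly the sub-linear curve described above.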

Of course, all of this will require a major paradigm shift from the users and it will require solid infrastructure from the providers of the centralized infrastructure. Additionally, it will likely only ever be viable in areas with high population density like city centres or highrises. Still, it will come as energy budgets become tighter. In fact, ultimately, the end result will likely be that any portable device one has will interface directly with the chosen centralized resources and even having a fixed display and keyboard installation will likely become much less common.

Yes, people, I know I have just described “the cloud”. While I don’t see how “the cloud” as implemented today is at all a good idea, ultimately, it will happen and it will have a net beneficial effect on the environment (so-called “emissions” – there’s more to an environmental footprint than just CO₂, people!) when it gets here.

Urban Planning for the Long Term

I recently had the opportunity to learn about some of the future plans for LRT expansion in Calgary. Ordinarily, that wouldn’t be anything to write home about. In this case, however, the city is seeking input on which route to take through the areas that were developed before rights of way were protected for future LRT development. Thus, we have the usual wrangling and NIMBYism. Again, nothing spectacular there except for one thing. One of the proposed routes goes through a natural area while the other two disrupt existing urban development, notably two of the busiest streets in the city.

My original knee-jerk conclusion was that the natural area option (running along an existing heavy rail line) was the best because it didn’t disrupt existing traffic patterns on very busy roads. Roads, I might add, that I used to use regularly and, thus, I understand clearly the impact of reducing road capacity on travel on those roads.

After studying the situation for a while, however, I came to a very different conclusion. The natural area is simply too far from the actual development for people to bother using it. Proponents of the option are quick to say that feeder buses will solve that, but I live in an area that has only feeder service. It is so inconvenient that I choose to drive even when transit would be a better choice and I am more predisposed to taking transit than many. That means disrupting one of the busy roads is a better option. Of the two, it turns out that the busiest road (where most of the buses currently run) is the best option. After all, the buses run on that road for a reason.

Clearly, reducing road capacity on a major arterial connection into the downtown core is problematic in the short term. Even with the traffic eliminated by the LRT, which should be substantial since it will replace a lot of bus traffic (well over 1000 buses per day on a typical four-lane urban street, according to the information I have), there will still be a large volume of traffic on that road. Some will displace to neighbouring roads, which should also see a corresponding decrease as a result of the LRT, so it may not be nearly so bad as it could be, especially if the remaining capacity of the road is designed sensibly (with appropriate turn bays). So, really, it’s probably not nearly so bad an impact as the knee-jerk assessment suggests.

There is, however, another important factor to consider. Once this line is built, it is built. It is unlikely that the resources will be available to relocate it. It will likely still be in service in a century or two so long as the city remains. But what will the city look like more than a century in the future? Most likely, it will look very much like it does today with one notable difference. There will be a great deal less automobile traffic and a great deal more localized travel. The same factors that are influencing localization of services now will only intensify as automobiles become more and more expensive to operate and as the resources to produce and power them become more expensive. For those who believe that electric vehicles will solve this problem, consider the expense of producing batteries and the infrastructure required to support charging them. Even if electric vehicles do extend the personal automobile horizon, that horizon will, eventually, come. And if, by some miracle, it fails to materialize, would we not all benefit from fewer automobiles on the road through increased safety and reduced air and noise pollution?

So it seems that by planning now for an ultimate future with very little automobile traffic within the urban area, and building infrastructure with that in mind, our future infrastructure costs can be reduced. We can also encourage the change that will eventually come to happen sooner, and thus begin to benefit from it sooner. This does, however, require a paradigm shift in urban planning – a shift away from planning for automobile traffic and toward planning for pedestrians and non-automobile traffic.

To make a long story short (I know, too late!), I have recently come to the conclusion that we should simply not be bothering to accommodate personal automobile travel but, instead, focus heavily on mass transportation systems and on making it convenient for pedestrians and cyclists to travel where they need to. Unfortunately, due to existing economic realities, it is not practical to do so in many areas of the city, and this is where we need a massive paradigm shift in the planning processes.

ITU and the Internet

Lately, there has been a bit of noise about the ITU (International Telecommunications Union) taking over the governance of the Internet. On the surface, that sounds reasonable. After all, the Internet is about telecommunications, isn’t it? But is it really a good idea? CIRA has a decent write-up on why it might be a bad idea and a link to a petition against it. I will avoid repeating the history of the ITU and other bits of information that can be easily discovered elsewhere.

Instead, I will add my voice to the many who believe there is no need for the ITU to be involved with the Internet. The current model of multi-stakeholder agreements seems to work quite well. Obviously, totalitarian regimes continue to impose censorship and other restrictions on the Internet in their jurisdictions. However, under the current model, those interests cannot impose their will on any jurisdiction that does not choose to allow it. Do those of us living in the more enlightened countries in the world wish to have our communication options controlled by a group which is beholden to dictators and totalitarian regimes? How about genocidal ones? Do we want our costs for communication to go up due to treaty enforced tariffs on Internet traffic? How about mandatory censorship of anything objectionable to Islam? Or China?

In short, I do not see how putting the ITU in charge of any aspect of the Internet is an improvement over the current model. While it may not necessarily be any worse, it is unlikely to be better. That would mean that making a change would simply be change for change’s sake! That is never a good reason to change anything.

Writing Reasonable PHP

PHP gets ragged on a lot for various reasons. One of the biggest complaints I see is that PHP is “insecure” as if writing bad code in PHP is somehow PHP’s fault. The other major complaint is not so much a complaint against the core language as against the standard library and runtime environment and refers to the chaotic nature of the standard functions in particular. Complaints about the latter have merit but PHP is far from the only popular language to have that problem. The former might have some merit but it is just as ridiculous as blaming C because programmers write buffer overflows. It is not strictly PHP’s fault when programmers do stupid things. Granted, PHP makes a lot of stupid things very easy and some of the early design decisions for the PHP runtime environment are questionable in hindsight, but writing sensible PHP code is not impossible or even especially difficult.

Types of PHP Code

Before I delve too far into the intricacies of PHP, let me touch on the types of coding that PHP can be used for.

PHP was designed (or evolved, really) as a means to enhance largely static web pages. It fit into the same niche as Microsoft’s Active Server Pages. It was designed to make adding a small amount of dynamic content to an otherwise largely static page easy. While this is still common today, it is no longer the primary use case. This is also the reason for a lot of the somewhat questionable design decisions for the runtime environment (such as the ever popular and justifiably maligned “register_globals” feature).

As it gained popularity, it began to edge out the use of CGI scripts written in Perl or other languages. This was partly due to the complexity of dealing with CGI on most servers and partly due to the fact that PHP itself handled all of the boilerplate stuff needed to deal with the CGI interface – decoding script input primarily. Thus, PHP scripts moved more toward being PHP code with HTML content embedded in it instead of HTML code with PHP embedded in it. Some of the more unfortunate design decisions were addressed at this point (during the 4.x series), including the “register_globals” problem, with the introduction of the “superglobal” arrays and a few other things. PHP also gained a sort of object orientation and a massive collection of “extensions”, many of which are bundled and/or enabled by default. This type of coding is the most common today – programs that are still intended to run in a web server environment and resemble the classic CGI script more than the classic “active page” model.

Finally, PHP gained a command line variant. With a few tweaks to the runtime environment, it became possible to write programs that do not depend on the presence of a web server or the CGI interface specification. Most of the historical runtime design issues do not apply to a command line PHP program. However, the source format remains the same including the PHP open/close tags.

A Sensible PHP Environment

A great deal of sanity can be obtained before a single PHP statement is written by setting up the environment in a sensible manner. Most of the features of PHP that are maligned (often justifiably) by critics can be turned off in the PHP configuration file. Notably, one should turn off register_globals, all magic quotes variants, register_long_arrays, allow_url_include, and allow_url_fopen. There are other configurations that make sense to disable too, depending on which extensions you are using.

It should be noted that disabling some of these settings makes coding less convenient. However, often the convenience comes at the cost of clarity or even security.
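As a sketch, the relevant php.ini directives would look like this. The last two lines are additional suggestions of mine rather than part of the list above, and note that several of these directives were removed outright in later PHP versions, where they can simply be left out:

```ini
; Disable the historically problematic conveniences.
register_globals     = Off
register_long_arrays = Off
magic_quotes_gpc     = Off
magic_quotes_runtime = Off
magic_quotes_sybase  = Off
allow_url_include    = Off
allow_url_fopen      = Off

; Additional suggestions: keep error details out of visitors' browsers,
; but do record them somewhere you can read.
display_errors = Off
log_errors     = On
```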

Writing PHP Code

Most of the recommendations here apply to all programming languages. Let me stress that. Writing good code requires discipline in any language.

Check Inputs

One of the biggest sources of problems with any program is failure to check input data. Anything input by a user must be viewed as suspect. After all, the user might be malicious or simply make an error. Relying on user input to be correct is never the right thing to do. Steps must be taken to ensure that bogus input data does not cause your program to misbehave. Inputs that cannot be handled should produce error conditions in a controlled manner.

Many programmers do grasp this concept intuitively. Input checking code is often present when handling direct user input. However, most overlook the simple fact that data coming from anywhere outside the program code itself must be treated as suspect. You cannot be certain that what you wrote to a data file is still in that file. It could have been corrupted by a hardware failure, user error, or the file could have been replaced with another type of file, all without your program being aware of it. The same applies to data stored in a database system like MySQL or in a session cache or a shared memory cache somewhere.

The advice here: Verify everything. Failure to do so correctly is not a weakness in PHP but in the programmer. It is also the single largest source of security problems. Careful adherence to this principle will quickly yield much better code.
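As a sketch of what that looks like in practice (the “page” parameter and its limits are made up purely for illustration), here is one way to validate a numeric input with a controlled fallback:

```php
<?php
// Treat everything arriving from outside the program as hostile until
// proven otherwise. The "page" parameter and its limits are made up
// purely for illustration.

function readPageNumber(array $input): int
{
    // filter_var() returns false for anything that is not a well-formed
    // integer in range, so a strict check separates "bad input" from a
    // legitimate value.
    $page = filter_var(
        $input['page'] ?? '',
        FILTER_VALIDATE_INT,
        ['options' => ['min_range' => 1, 'max_range' => 10000]]
    );
    if ($page === false) {
        return 1;  // controlled fallback instead of limping on with garbage
    }
    return $page;
}

var_dump(readPageNumber(['page' => '7']));        // a valid page number
var_dump(readPageNumber(['page' => '7; DROP']));  // rejected, falls back
var_dump(readPageNumber([]));                     // missing, falls back
```

The same shape works for data read back from files, databases, or caches: validate, and produce the error condition deliberately rather than letting garbage propagate.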

Check Returns

Closely related to the previous item, and high up on the list of programmer errors, is failing to check return values from function calls. Most library functions will have some sort of return value. For functions that can fail for whatever reason (bad parameters fed in, external state, etc.), it is absolutely critical to check for those failure conditions and handle them in a manner that is appropriate for your program. These conditions can be as simple as a data file being missing or as complicated as a remote socket connection timing out or the database server going away.

Study all function calls you use and make certain you understand what failure conditions exist. If a failure condition will cause your program to fail or otherwise misbehave, handle it. If a failure condition is impossible, it is doubly critical to handle it. That said, if a failure condition will not cause your program to misbehave or otherwise fail, it can be ignored, but make absolutely certain that is the case and document why.

The advice here: Always check return values.
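A minimal sketch (the file path and function name are hypothetical) of what checking returns looks like with the standard file functions, which report failure by returning false:

```php
<?php
// fopen() returns false on failure; ignoring that leads to a cascade of
// warnings from every later call made on the bogus handle.

function countLines(string $path): ?int
{
    $fh = @fopen($path, 'r');
    if ($fh === false) {              // strict check on the return value
        return null;                  // report the failure to the caller
    }
    $lines = 0;
    while (fgets($fh) !== false) {    // fgets() also returns false at EOF
        $lines++;
    }
    fclose($fh);
    return $lines;
}

$n = countLines('/no/such/file');
echo $n === null ? "could not open file\n" : "$n lines\n";
```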

Protect Output

This one is a lot less obvious and is best explained by example. Suppose you are outputting some text into an HTML document and you do not know in advance what characters that text contains. In HTML, some characters have special meanings (such as quotes) but are also valid in actual text. These special characters have to be protected in a medium appropriate way. In the HTML case, they would be replaced with appropriate entities. This is a common case in PHP programming but it is not the only one. The same applies when passing data to a database system like MySQL using SQL or when passing command arguments to an external program. Failure to protect output properly is the leading cause of a class of security vulnerabilities known as SQL injection attacks. There are analogs for other output streams too. Sometimes the corruption of the output stream is mostly harmless, like when an unprotected comma is inserted into a CSV field in an informational spreadsheet. Other times, it can cause cascading failures or even allow clever attackers to obtain private data.

The advice: Always protect output, no matter where it is destined.
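A brief sketch of protecting the same text for three different destinations; the point is that the protection depends on where the data is going, not where it came from:

```php
<?php
// The same text, protected for three different destinations.
$comment = 'He said "5 < 6", obviously';

// HTML destination: entity-encode the characters HTML treats specially.
echo htmlspecialchars($comment, ENT_QUOTES, 'UTF-8'), "\n";

// CSV destination: fputcsv() quotes fields containing commas or quotes.
$fh = fopen('php://temp', 'r+');
fputcsv($fh, ['id42', $comment, '1,234']);
rewind($fh);
echo stream_get_contents($fh);
fclose($fh);

// Shell destination: quote the whole argument as one word.
echo escapeshellarg($comment), "\n";
```

For SQL, the equivalent is to hand the text to the driver as a bound parameter of a prepared statement rather than splicing it into the query string yourself.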

Use Correct Operators

This is more specific to PHP but there are similar situations in other languages. In PHP specifically, there are two equality and two inequality operators. One set does loose type handling and attempts to find some means to compare its operands, to the point of doing type conversions behind the scenes. The other set will fail if the underlying types of the two operands are different even if the apparent values are the same. The “==” and “!=” operators are the first set and “===” and “!==” are the second set. Using the former, the string “0” and the number 0 will compare as equal while with the second they will not. This is important because many functions will return “false” on an error but some other type (like a number) on success. If you use the loose comparisons, “false” and “0” are equal but they are not with the strict comparisons.

PHP also has a number of functions which can be used to identify NULL values, arrays, and so on, which can also be employed when the type of a value is important.

In most cases, the strict comparison operator is probably the better choice but the loose comparison can be useful. In short, write what you mean using the correct operators. Make sure you know exactly what the operator you choose is doing.
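The classic trap is strpos(), which returns the integer offset of a match on success and false when there is no match. A match at offset 0 loosely compares equal to false:

```php
<?php
// strpos() returns the match offset on success and false on failure.
// A match at offset 0 is a success, but it loosely compares equal to false.

$haystack = 'administrator';

// Buggy: 0 == false, so a match at the very start reads as "not found".
if (strpos($haystack, 'admin') == false) {
    echo "loose check: not found (wrong!)\n";
}

// Correct: distinguish the integer 0 from the boolean false.
if (strpos($haystack, 'admin') !== false) {
    echo "strict check: found at offset ", strpos($haystack, 'admin'), "\n";
}

var_dump('0' == 0);   // loose comparison with type juggling
var_dump('0' === 0);  // strict comparison: the types differ
```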

Using Language Constructs

Like any programming language, PHP has a number of language constructs that are very useful but there are other ways that similar effects can be achieved. For a trivial example, consider the use of a long “if/elseif/elseif/else” structure comparing a single variable against a series of values. This can also be expressed using a “switch” statement. In this trivial example, either one is valid and roughly equivalent, though the “switch” statement has a few features that might make it more useful in some circumstances. Likewise, a “for” loop can always be faked using “while”.

On the other hand, there are cases where an alternative is not equivalent. Consider the case of “include/require” vs. a function call. While the fact that you can include the same file in dozens of different places looks a lot like a function call, and can often be used for a similar effect, it is not the same thing. The included code runs in the same scope as the location of the include directive, for instance, which means that any variables in the including file might be scribbled over by the included file. Parameters also must be passed in variables and return values returned the same way. It is also not possible to use such a “function” recursively. On the other hand, an actual function call gains its own local variable scope, preventing the function from clobbering variables in the caller, and also has a formalized parameter list and return value. Furthermore, functions can be called recursively, which is also incredibly useful. Thus, it is important to use the right construct for the job. “include” is not the right construct to execute a chunk of code from random locations. (I have singled this particular one out because it shows up far too often in PHP code.)

The advice: use the right language construct for the job. This applies not only to things like “include” but also to things like objects. Creating an object to “encapsulate” a behaviour adequately described by a single function is just as silly as using “while” to simulate “for”.
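A small sketch contrasting the two (the “included” file is generated on the fly purely to keep the example self-contained): the included snippet scribbles on the caller’s variable, while the function cannot:

```php
<?php
// The "included" file is written out here only to keep the example
// self-contained; its name is arbitrary.

function addAndClobber(int $a, int $b): int
{
    $sum = $a + $b;
    $a = 0;          // changes only the function's local copy
    return $sum;     // formal return value, no shared variables needed
}

$a = 2;
$b = 3;
echo "function: sum=", addAndClobber($a, $b), " a=$a\n";  // $a is still 2

// The include-as-function anti-pattern: inputs and the "return value"
// travel through shared variables, and the snippet runs in this scope.
$inc = tempnam(sys_get_temp_dir(), 'inc');
file_put_contents($inc, '<?php $sum = $a + $b; $a = 0;');
include $inc;
echo "include:  sum=$sum a=$a\n";                          // $a is now 0
unlink($inc);
```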

Wrap Up

The preceding is, by no means, exhaustive. However, by following the above recommendations, it is possible to write reasonable PHP code. All it requires is a bit of discipline and an understanding of the language you are using.

I should note that this is not an apology for PHP but merely a set of suggestions to avoid writing bad code. Remember: just because PHP allows you to do something in a particularly unfortunate way does not mean that you have to do it that way. If it looks like a bad way to do things, look for a better way. Odds are pretty good you will find one.

Ruminations on Legal Systems

I have had occasion to ponder the basis of the legal systems used in many Commonwealth countries and also in other former British colonies. This basis is often called “common law”. Common law is basically law as defined by decisions made by courts and similar bodies, which enter into the system as precedents. These precedents then have force of law until countermanded by a legislative action or further precedent. The alternative is that all laws must be made by a governing body such as a legislature or king. Upon reflection, it is not clear to me that common law is necessarily a good solution.

Continue reading “Ruminations on Legal Systems”

Climate Change and Heat

It’s currently the “in” thing to talk about anthropogenic global warming (AGW), which is the notion of global warming being caused by human activity. Whether AGW is real or not is not the point of this post, however. Neither is debate over whether “climate change” (a term usually conflated with AGW in popular culture) is a bad thing or not. Rather, I’m going to consider a couple of mechanisms that might lead to the AGW effect. Continue reading “Climate Change and Heat”

Sustainable Settlement

Sustainability is the buzzword of the day. Everyone wants sustainability. But somehow, everyone seems to miss the point of sustainability. Have you heard a policy maker talk about “sustainable growth”? That’s utter nonsense. Anyone putting a bit of thought into the matter will realize that growth cannot be sustainable indefinitely. After all, there is only a finite set of resources available to fuel it. Leaving aside systemic biases toward perpetual growth, however, let’s muse about what a sustainable settlement on any planet would need to look like.

Continue reading “Sustainable Settlement”