Mountain Climbing

So I climbed an actual mountain yesterday. I don’t mean that I broke out ropes and pitons and scaled rock faces and scrambled up scree slopes. What I actually did was walk up the Cosmic Ray road trail on Sulphur Mountain in Banff. However, with an elevation gain of around 900 metres in the space of roughly 8 kilometres, that works out to an average grade of roughly 11%. And since the lower portion is much gentler than that average, stretches of the upper portion are considerably steeper. What I’m saying is that old road or not, it’s a real climb.

I have two handicaps for doing such a climb. One is that I am in lousy physical condition. I need to lose more than 100 pounds, so that should give you some idea how things stand. The other handicap is that I suffer from acrophobia. This presents, most inconveniently, as unsteadiness on my feet near any sort of drop-off, in addition to anxiety.

There was nothing I could do about my physical condition. I did, however, bring a walking stick to help with the balance situation. It turns out that the stick was actually enough to make me feel reasonably stable, even to the point of being able to look out over the valley below at many points.

The weather was not conducive to strenuous activity for a couple of reasons. First was the heat – it was unpleasantly hot, though many folks reading this will not think that mid to high twenties (Celsius) is unpleasant. For doing anything strenuous, it is. If that were the whole of it, things would not have been so bad. On top of the heat, however, there was a blanket of forest fire smoke over the area, which irritated the airways and made breathing that much more difficult.

It took a fair time to work out an effective climbing strategy. Lower down, I tried walking at a more normal pace and very quickly realized that would not work. It took relatively little to bring my heart rate up to a worrying point. Slowing down helped some, but a continuous pace at any speed still kept my heart rate at a worrying level. I ended up taking a number of breaks along the way, sitting down on the trail facing downhill (it’s more stable that way) and waiting until my heart rate slowed. Eventually, I hit on a strategy that covered ground at a steady, if slow, pace: I would walk ten or twenty steps and then pause for a few seconds. I kept up this walk-and-pause process for more than half of the climb. It had a few beneficial effects. It kept my heart rate at a sane level, for the most part. It also reduced the overheating effect from the weather (which I think was contributing to the heart rate, in fact), and it kept my legs from turning to mush. As a matter of fact, except for being tired, I felt relatively good when I got to the top.

I ran into a couple of problems on my excursion. First was the insect population lower down the trail. In the well forested areas, mosquitoes and their ilk were thick as anything. My insect repellent said it was supposed to be good for 8 hours. Nope, not even close. I was lucky if it was still effective after one. I suspect a large part of the problem was that I was sweating it off.

The other problem is that I ran out of water about a kilometre or so from the top. I really should have run out a kilometre or so before that but I realized I was getting low and rationed it a bit. I was still producing sweat at the top, though, so I wasn’t so dehydrated that I was especially worried. Still, more water would have made that last kilometre much easier, if only for the benefit of clearing the crud from my throat. I started out with about 2 litres. I probably should have had 3 or 4 litres given the heat.

Now for the real question: would I do this climb again? Not unless I lose substantial weight first. I would also seriously consider putting it off until cooler weather. Beyond that, I would tackle shorter climbs until those were relatively easy to accomplish before taking on a climb like Sulphur Mountain. There are plenty of much shorter climbs of equivalent difficulty.

Overall, I’m happy I did it, though. The views, even with the smoke blanketing the area, were spectacular, especially from the higher elevations. The trail is also on the opposite side of the mountain from Banff, so there is no sign of civilization to be seen. In fact, you can’t even see any indication of the old weather observatory on Sanson’s Peak or the gondola terminal until you’re nearly at the top.

I should mention there is another trail that zigzags up the face of the mountain beneath the gondola. That trail is somewhat shorter, but the elevation gain is also something like 300 metres less. The scenery is not nearly so nice as on the back route, though.

If you choose to attempt this particular climb, I recommend getting on the trail at the bottom no later than noon. If you’re in poor condition, aim for closer to 10 AM. You definitely do not want to miss that last gondola going down – if you do, you’re walking down. Obviously, going down would be a lot less strenuous than going up and would likely take a lot less time (probably on the order of 2 or 3 hours), but if you’ve already climbed up, you won’t want to do that. I also find that going down is harder on my knees, so that is a consideration as well.

Generica – good or bad?

In recent years, a number of successful Canadian shows have achieved broader appeal in the United States and possibly elsewhere. Those in the know will recognize the likes of Bitten, Orphan Black, Continuum, and Lost Girl. These shows all have Canadian locations, at least in part. Bitten clearly identifies Toronto, Continuum clearly identifies Vancouver, Lost Girl is fairly cagey on the issue, and Orphan Black specifically mentions places that strongly suggest Toronto and area. Bitten and Orphan Black have also specifically identified locations in the United States.

Let’s look at the cases where there are locations in both countries. Nobody ever seems to have any difficulties crossing the border. There are never any border delays. They never need to be at the airport hours early. While border crossing can easily be handwaved away as happening during the boring part of travel, it does seem odd that there is no red tape preventing the likes of a convicted child rapist from crossing. Sure, such a person could just walk across the border in one of the many unguarded stretches, but that would take far longer to accomplish than is shown.

Leaving aside the border issues, there are other things that make people who know the score raise their eyebrows. In Orphan Black, for instance, characters are shown spending money that is clearly Canadian and driving cars with licence plates that are correct for Ontario, yet the police carry badges that look more at home in the United States and their procedures are at odds with known Canadian procedure. In Continuum, we have continual references to three-letter agencies in the United States, though there have also been references to the appropriate agencies in Canada. (To be fair, most Canadians are at least somewhat familiar with the US three-letter agencies, so they might plausibly mention them.)

What we have, then, is clearly not Canada as we know it. For the shows that use locations in the United States, there are probably also differences from reality, but those are probably less marked than the ones for Canadian locales. This is not surprising, since audiences have a great deal of exposure to the “television America” that has developed over the years in programming produced in the United States.

What we have in the current crop of shows is an amalgamation of the United States and Canada that seems to be emerging as a standard setting: an unspecified town somewhere in English-speaking North America. This does make some sense given that the cultural differences between Canada and the United States are not so pronounced as many would like to believe. Sure, there are major differences, but there is a vast base of shared culture. This generic amalgamation has been called “Generica” by at least one commentator.

Generica has not shown up just in current sci-fi/fantasy television, either. For instance, the Good Witch series of movies is set in a generically named Middleton which seems to have many of the features of Generica. Leaving aside the inconsistencies in setting between movies, it’s not clear exactly where Middleton is, even with references to well known cities or locations.

So is Generica a bad place to use in fiction? Well, not really. People not really in the know will never notice, and that is the vast majority of the world at large. For people ostensibly living in Generica, it probably makes even more sense. Anyone who has studied North America at all will realize that there are huge cultural differences from region to region, even within a single country, province, or state. There are certainly enough procedural differences in civil institutions from state to state or city to city. When you add all this up, it becomes difficult to keep everything straight, especially in the face of changing demographics and regulations.

As long as the particular brand of Generica used in a particular universe remains internally consistent, there should be no strong objections to its use. Thus, as long as Orphan Black remains internally consistent, it should not matter that it takes place in a version of Generica. It is, after all, fictional. The same goes for any other show.

Generica has an added benefit. It is clearly not the world we live in. That means that the writers need not take particular pains to get everything exactly right for their setting of choice. Instead, they can concentrate on storytelling and largely ignore the inconvenient aspects of real jurisprudence or culture. As long as they do not claim to be reality, what harm is done?

Long live Generica!

Time as Distance…

It occurs to me that there is a particularly interesting phenomenon in many parts of the world when it comes to discussing distance. Have you ever had a conversation like this one:

A: How far is it to Fooburg?
B: About five hours.
A: Cool, thanks, man!

If you have, then you have directly experienced the phenomenon I am talking about.

This is extremely common in my neck of the woods. Yet if you asked someone to measure the distance, they would probably look up a distance chart and give you a number of kilometres or possibly miles. The notion of measuring distance in time units is, logically, considered dumb.

Then why do we use time units in a distance context? Well, a close examination reveals that it really isn’t a distance context. The specific context of the above exchange will usually be related to travelling. That relationship will usually be implied by context, but may also be explicit. In the context of travelling, it is usually how long it takes to get somewhere that matters the most, not how far it actually is. Obviously actual distance and mode of travel do matter, but the mode of travel is assumed for certain distances.

So why do we talk about time in a travel context? Because of the distances involved. In much of North America, distances between significant places (cities and the like) are measured in hundreds of kilometres, sometimes even in thousands. The dominant cost of such travel is not so much the price of the transportation as the time investment required to actually do it. When things are close, we talk about a few miles or kilometres, but once things get into nontrivial distances, we start talking about travel time instead.
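
For fun, here is the conversion made explicit, as a tiny Python sketch. The 100 km/h average is just an assumed highway speed, and Fooburg is as fictional as ever:

    # Turn a highway distance into the "time as distance" answer a local
    # would give. The 100 km/h average speed is an assumption, not a rule.
    def travel_time_hours(distance_km, avg_speed_kmh=100.0):
        return distance_km / avg_speed_kmh

    # "How far is it to Fooburg?" -- 500 km away at highway speeds:
    print(f"About {travel_time_hours(500):.0f} hours.")  # About 5 hours.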

When you consider the implicit context, it actually makes perfect sense. It only seems odd to someone who is not aware of that context – usually someone who has not regularly dealt with long travel distances.


The Intertubes Are Broken…

Some time overnight between April 3 and April 4, something happened on the Internet that created general brokenness. To make matters worse, it was inconsistent, or at least it appeared that way. At $dayjob, this led to all manner of weird complaints and irate customers demanding that we fix things.

The symptoms were quite variable. The first I was aware of it was when one of our internal processes was no longer able to access a remote API properly. However, it soon became clear that something much larger was going on as customer calls came in. I checked all the usual suspects and found nothing. I even checked the unusual suspects. Still nothing. I rebooted routers and servers on the off chance it was related to them. Still nothing. It was then clear, to at least six nines of certainty, that the problem was not on my end.

Over the course of the day on Friday, I continued to investigate. I cranked up tcpdump and watched packet traces. That previously mentioned internal process was a useful troubleshooting point as it gave me something I could directly control and provided an easy means to trigger the problem on demand. Eventually, I determined that the remote TCP connection was establishing fully and the initial SSL negotiation was completing. However, as soon as real data started to travel back and forth, the connection would stall completely. I poked and prodded it from various angles and the symptoms remained the same.

As the day progressed, I also started seeing problems on my email server as inbound connections stacked up until they eventually filled all available connection slots. All were stuck in either the STARTTLS or the DATA state, which are both points in an SMTP dialogue where nontrivial data begins to pass: the former is SSL negotiation, just like the internal service above, and the latter is actual message data transmission. I also saw the occasional connection stall on the web server.
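
Incidentally, that stuck-in-STARTTLS symptom is easy to probe for on demand. Here is a minimal sketch using Python’s standard smtplib (the host name is hypothetical):

    # Probe an SMTP server for the "stalls once real data flows" symptom.
    # The connect and EHLO exchanges use small packets and succeed; the
    # STARTTLS certificate exchange is where full-MTU packets start
    # moving, so a path MTU problem shows up as a stall right there.
    import smtplib
    import socket

    def probe_starttls(host, port=25, timeout=15.0):
        try:
            smtp = smtplib.SMTP(host, port, timeout=timeout)
            smtp.ehlo()
            smtp.starttls()
            smtp.quit()
            print(f"{host}: STARTTLS completed normally")
        except socket.timeout:
            print(f"{host}: stalled -- consistent with an MTU problem")

    probe_starttls("mail.example.com")  # hypothetical host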

Now I was certain what was going on. There was a link somewhere with a smallish MTU on which path MTU discovery was not behaving correctly. It was now a question of where and, better yet, what to do about it.

Having established that it was not on my network, and doing another check of same now that I knew it was an MTU issue, I contacted my upstream provider. I rapidly got nowhere with them. I couldn’t even convince them there was a real problem. Even after I told them it was an MTU issue, they continued trying to troubleshoot using ping and traceroute, neither of which is at all useful in this situation. Of course the small packets from traceroute or ping are going to get through. You need a large, MTU-sized packet to make it fail!

All the while I was getting nowhere with my upstream, I was pondering whether there was a way to work around it at my end. Finally, the obvious thing occurred to me: why not lower the MTU on my side and see what happens? I did, and instantly packets started flowing properly! So I did a quick convergence probe using different MTU values until I found the maximum one that still worked. I very quickly hit on 1492, just 8 bytes smaller than the standard Ethernet MTU of 1500 (suggestively, 8 bytes is exactly the overhead of a PPPoE header). Setting this on all the affected servers eliminated the logjams and restored service to proper operation throughout.
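
The probe itself was nothing fancy. Something along these lines does the job on Linux, where ping -M do sets the don’t-fragment bit (the peer host here is hypothetical, and the -s value is the MTU minus 28 bytes of IP and ICMP headers):

    # Binary-search the largest MTU that actually passes to a host, using
    # ping with the don't-fragment bit set (Linux iputils: -M do).
    # Assumes the low end of the range works at all.
    import subprocess

    def ping_with_size(host, mtu):
        payload = mtu - 28  # ICMP payload = MTU - 20 (IP) - 8 (ICMP)
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", "-M", "do", "-s", str(payload), host],
            capture_output=True,
        )
        return result.returncode == 0

    def find_working_mtu(host, lo=1200, hi=1500):
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if ping_with_size(host, mid):
                lo = mid      # this size gets through; try bigger
            else:
                hi = mid - 1  # too big; back off
        return lo

    print(find_working_mtu("peer.example.net"))  # hypothetical host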

Further research over the weekend suggests that the problem was showing on connections that traversed a major interconnection point in or around Chicago. That’s just speculation, however. It would explain why there were no apparent problems reaching some providers but other providers were essentially unreachable. The ones that were working generally traversed a western interconnection point or didn’t touch that particular area in the network topology. Unfortunately for us, a very large percentage of our traffic does cross that interconnection point.

It would be interesting to learn at some point just what caused the problem. I can think of a few mechanisms that do not rely on total incompetence or malice. However, I don’t expect anything to be forthcoming. In the end, I will probably leave the MTU where it is currently – it seems to have reduced the general level of stuck connections overall, which suggests there are many smaller edge cases with the same problem.

Prostitution in Canada

Recently there has been much talk about legalizing prostitution in Canada. It turns out that the actual act, exchanging money for sexual favours, has never actually been illegal. Instead, just about every possible means of agreeing to or supporting the exchange has been either outright banned or so severely curtailed that it might as well have been banned. Thus, for all intents and purposes, it has actually been illegal. However, a recent constitutional challenge was successful so all of that appears to be changing.

Personally, I think that prostitution itself should be legal. A lot of the naysayers like to employ bafflegab and other misdirection techniques to trigger the “ZOMG! The Children!” and “ZOMG! Slavery!” responses among the public. This is immensely disingenuous. Simply removing the barriers to prostitution itself does not automatically make slavery or child exploitation legal. The most recent luminary I heard conflating “prostitution” with “slavery” was none other than Calgary’s own Police Chief Rick Hanson. Shame on him! He should know better! In a radio interview, he seemed to imply that legalizing prostitution would somehow make child prostitution and slavery more prevalent than they already are. This does not logically follow unless the police magically stop enforcing the other laws against such things. After all, the laws against sexual exploitation of children and slavery have not been ruled unconstitutional.

Rather than gnashing teeth and wringing hands about straw bogeymen, it would be far better to have a constructive examination of how to make the industry as safe as possible. Obviously there will be some sort of transition period during which the less savoury aspects of the industry will need to be rooted out. Also, the aspects of the industry that remain illegal will still need resources directed at cleaning them up. However, without consensual relations between adults sucking up resources, this might actually be a bit easier. Obviously, we will still need to enforce laws against the various methods of forced exploitation, slavery, and, most especially, exploitation of minors.

There is another aspect to legalizing prostitution that is often largely ignored. If it is legal, it becomes practical to collect taxes on “professional services”. Yes, folks, that means income tax and sales taxes. We can also require proper regular medical checkups for workers in the trade and apply other health regulations.

Hopefully, also, by eliminating the “it’s illegal, you’ll go to jail if you complain about your lot in life” aspect, more of the victims of the unsavoury aspects of the current industry (those pressed into slavery by one means or another, those underage, and so on) will be willing to come forward. I have no illusions that there will be a stampede of such – fear is a powerful demotivator, but perhaps more will come forward before it gets to that stage. Remember, organized crime will still be crime and still be illegal!

All told, though, this change will not make the problem of human trafficking or exploitation of minors worse. It will only bring the existing abuses into sharper relief as more attention is focused on them in the near term. This will give the false impression that the situation is worse than it was. By removing attention from what are otherwise consensual acts between adults, more attention will be directed to the reprehensible abuses. That cannot be a bad thing.

Finally, I leave you with a question. Why is consensual sex for money (between adults) even considered a bad thing in the first place? The word “consensual” means, by definition, that neither party is being forced into the act!

libtool bogosity

Over the years, I’ve encountered a rather large number of packages that use libtool to construct shared objects of one kind or another. Granted, creating a shared library requires a bit of system specific knowledge since things behave differently with ELF or Win32 or <insert shared library system here>. Packaging some of that information up into a single easy-to-use tool seems like a brilliant idea on the surface. But it breaks down rapidly in actual implementation. As far as I can determine, libtool is about the least sensible implementation possible, at least as it is commonly used.

The biggest issue I have with the way libtool has been used is that it doesn’t actually do the entire build process during the build process. When the build finishes, you do not have the shared object files in your build tree. Instead, you have “libtool object” files and possibly other files stashed away in weird places. No, the actual shared object is built during the install phase.

That doesn’t sound so bad, you say? Well, consider if I am building as a non-privileged user (as everyone should). The build finishes and now it’s time to actually install the package. Unavoidably, installing to a system location requires elevated privileges. So now libtool gets to work creating new files in the build tree, files owned by root and constructed by running linkers and compilers. That is, the very things I shouldn’t be running as root are being run as root. This is particularly asinine given that there is no platform I am aware of that requires the final shared object to be built as root, nor does the final shared object depend on the state of the system at install time. It is perfectly possible to build everything, including the final shared objects, at build time. If something needs adjusting at install time, that is a clear problem with the rest of the build process.

Anyone who has had dealings with the autotools from the GNU folks will not be surprised that libtool has such a deficiency. It is, after all, built on the same philosophy. The very thing that makes autoconf and automake so complex and fragile afflicts libtool too. The tools try to support absolutely every system out there, including many dozens that are obsolete and either haven’t been in actual use for decades or can’t actually run many of the packages being built in the first place. For instance, you hardly need libtool to build a shared library for a system that doesn’t support shared libraries at all. If your package only supports modern POSIX systems using a typical Unix tool chain, you don’t need any special knowledge to build a static library (build the object files and package them up with ar), and building a shared object is fairly straightforward too, as sketched below. You only need to support a small number of minor variants, which can be trivially selected with an environment variable, a special make target, or, gasp!, editing a couple of lines in a makefile.
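
To make that concrete, here is a minimal sketch of the whole job on a typical modern ELF/Unix toolchain, wrapped in a little Python driver purely for illustration. The source file names are invented; the cc, ar, and -shared invocations are the entire trick:

    # Build a static library and a shared object with no libtool in
    # sight, on a typical ELF/Unix toolchain.
    import subprocess

    def run(cmd):
        print(" ".join(cmd))
        subprocess.run(cmd, check=True)

    sources = ["foo.c", "bar.c"]  # invented source files
    objects = [src.replace(".c", ".o") for src in sources]

    # Compile position-independent object files (fine for both kinds).
    for src, obj in zip(sources, objects):
        run(["cc", "-c", "-fPIC", src, "-o", obj])

    # Static library: just package the objects up with ar.
    run(["ar", "rcs", "libfoo.a"] + objects)

    # Shared object: one link step, at build time, no root required.
    run(["cc", "-shared", "-o", "libfoo.so"] + objects)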

My sincere request is for any package builders to consider carefully if their package really benefits from using libtool. If you don’t support multiple platforms, you do not need it. Period. If you aren’t building shared libraries, you don’t need libtool. If you are building shared libraries, odds are you aren’t doing anything that really needs it either save for being lazy and not bothering to look up how to create a shared library. If you are creating a new package today that is not intended to work on old systems (or which cannot for whatever reason), you don’t need libtool’s support for divining random features of early shared object implementations.

That said, you may find that libtool is actually useful for your project. Just try to convince it not to build any objects during the install phase.

Based on past experience with actually trying to use the GNU autotools, watching libtool do weird things during build/install processes, and some minor attempts to use libtool for a couple of purposes, I have come to the conclusion that, like autoconf and automake, libtool is largely a solution looking for a problem these days. It is often used in situations where it makes no sense, or it is used incorrectly. Also, as with autoconf and automake, the recommended methods of use only underscore the fact that the GNU project has long since lost touch with keeping things comprehensible and, instead, continues to pile complications on top of hacks.

Are Fiat Currencies Intrinsically Bad?

There are a number of experts who insist that fiat currencies cannot possibly be a good thing because they have absolutely nothing backing them. That is, they have no objective value. Instead, they usually insist that something like gold or silver is better because it has retained value over the millennia. They insist that their substance of choice has an objective value.

Silver Screen Computer Interfaces

We’ve all seen them. Any movie or television program that uses computers in a manner even tangentially related to the plot will show some sort of interface. It may be the crazy 3D “I know this! It’s Unix!” interface from Jurassic Park or the magic progress bars for copying or moving things (anything – files, money, etc.). Perhaps it was the crazy “hacker software” used by some in-story genius with deep magic. Or maybe it was the simple word processor with text big enough to read from across the room, as in the opening credits for later seasons of Murder, She Wrote.

The earliest computers were depicted with various whirligigs and blinkenlights interpreted expertly by a smoking hot woman wearing an outfit whose practicality was dubious at best. (Or perhaps the technician was reading a paper tape.) These were actually not as far from the truth as later attempts. Early computers did not have display screens, after all.

With the advent of computers (or terminals) with character cell (and later bitmap) displays, there was much more room for interpretation. Now the computer could be communicated with in a manner that the audience might understand. Some movies, like WarGames, actually treated things fairly accurately, though the villain computer did have a much better grasp of English than it perhaps should have. At the time, most interfaces were text based; only expensive installations had significant graphics capabilities. WarGames also avoids much of the idiocy with progress bars and the like.

On the other hand, you have movies like The Net. It makes an attempt to have a realistic interface system, but it promptly breaks what was actually possible at the time by having a visit to a web page cause an icon to appear outside of the web browser. It also employed the Magic Floppy™ which can somehow hold a very large amount of data. Still, if those were the only faults computer depiction had, most experts could live with it. After all, both of those things are, at least, theoretically possible with the right setting. (Say a modern SD card was packaged in a floppy form factor? Say the web browser integrated with the computer desktop?)

On the flip side, you have the database searches in just about every police procedural produced in the past decade. The search process is always depicted as linear, and it always seems to display every candidate match as it tests it, often going so far as to display the various things it is doing to perform the match. This is done for everything from fingerprints and facial recognition to scanning a hard drive for a local file. It is, of course, ridiculous: doing so would slow the process down horribly, as demonstrated below. In most cases this can be handwaved aside as a sort of throbber, only showing the occasional candidate as an indicator that the system really is doing something. However, in some cases it is clear that every candidate is shown. Consider dialogue like “Go back three!”.
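
If you doubt the slowdown, it is trivial to measure. A toy Python sketch with invented data, comparing a silent scan against one that reports every candidate the way the on-screen versions do:

    # Compare a silent linear search against one that displays every
    # candidate it tests, TV style. The display is the expensive part.
    import time

    records = [f"suspect-{i:06d}" for i in range(100_000)]  # invented data
    target = records[-1]

    start = time.perf_counter()
    found = next(r for r in records if r == target)  # silent scan
    silent = time.perf_counter() - start

    start = time.perf_counter()
    for r in records:                                # TV-style scan
        print(f"checking {r}...")  # rendering every candidate costs time
        if r == target:
            break
    noisy = time.perf_counter() - start

    print(f"silent: {silent:.4f}s  with display: {noisy:.4f}s")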

Another egregious sin is the Magic Progress Bar™. This is a progress bar that takes exactly as long as the plot requires to reach some particular plot-important point, usually 100%. The Magic Progress Bar™ is often applied to things that take no appreciable time to actually process. When applied to file transfers of any kind, software installation, handshaking protocols, and so on, the progress bar is defensible, though it may still fall into this category. When applied to something like an electronic funds transfer from one account to a single other account – a process where submitting the transaction is instantaneous; how long the actual transfer takes between institutions is more variable and is usually measured in hours or days – a progress bar is simply stupid. I should note that there are cases where it would make sense even in that context, say when the funds are split across multiple accounts and are being transferred to multiple other accounts. In that case, a progress bar and/or a funds countdown might be sensible, since each transaction takes a non-zero amount of time.

In the middle of the road, neither good nor bad, are the operating systems and computer software that somehow magically know what the user intends and show a massive, screen-filling representation of whatever is convenient at the time. This can be explained as mere narrative expediency in most cases. If it can be assumed, for the purposes of the plot, that the character would reasonably know how to do whatever action is being done, or could reasonably figure it out quickly, there is no reason to waste time showing every minute step along the way. Instead, a short time rattling at a keyboard, clicking a mouse, or similar is clearly sufficient. The display shown is clearly intended for the audience to understand. These interfaces can represent anything from accounting or other business software to computer games to simple computer management and even hacking software.

For the most part, computer interfaces depicted on screen are getting more realistic, even if they do not match existing software. Where before, treatments like WarGames were the exception, the depicted interface is now at least plausible more often than not. Sometimes it’s even a real one (if the purveyor of the interface paid for the placement, for instance). Still, some things have far too much traction, notably the Magic Progress Bar™ and the “show everything we’re searching” notion. (Note that this is about interfaces, not capabilities, so I won’t mention things like infinite zoom/enhance.) Even so, this stuff can often be overlooked if the remainder of the story or production is high quality, and that is getting easier as the egregious errors become fewer, on average.


Ruminations on Writing: Archaic Forms

Something that is especially common in fantasy stories is the use of archaic pronouns and verb forms, usually accompanied by other archaic constructions. Commonly this is referred to as “old English” or “ye olde english” (pronounced “yee oldee english”, which, incidentally, is incorrect). The errors are most commonly made by amateur writers, but I have seen them from professionals whose work should have passed the desk of at least one competent editor.

Bitcoin as a Ponzi Scheme

I keep hearing claims from various sources that Bitcoin is a Ponzi scheme. People claiming that clearly have no notion of what a Ponzi scheme actually is.

A Ponzi scheme is one in which investments are sought for a security that may or may not exist. The returns paid or promised for that security are above the actual market performance of the security. (Obviously, for a non-existent security, any rate of return is above the market performance.) Early investors are paid out of the proceeds of selling shares to new investors. The “security” in this case could be anything from a company to a real estate deal.

As long as the amount being withdrawn or paid to investors is less than the amount being brought in by new investments, the Ponzi scheme does not collapse. However, once payouts exceed new investments, the scheme rapidly collapses. The purveyor of the scheme will usually be long gone by this time, having taken his cut for brokering each transaction, or even having stolen the remaining assets outright, thus hastening the collapse.

It should be noted that in a good Ponzi scheme, the early investors will, in fact, receive the promised return. This is critical to obtaining the word of mouth references that entice new investors in.
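
The arithmetic of that collapse is simple enough to sketch. In this toy Python model (all rates and amounts invented purely for illustration), the fund holds nothing but new money minus the operator’s cut, and it dies the month redemptions exceed what is left:

    # Toy Ponzi model: collapse comes the month redemption demands
    # exceed the cash on hand. All numbers are invented.
    def run_ponzi(inflows, promised_return=0.20, operator_cut=0.10,
                  redemption_rate=0.30):
        fund = 0.0  # cash actually on hand
        owed = 0.0  # principal plus promised returns owed to investors
        for month, new_money in enumerate(inflows, start=1):
            fund += new_money * (1 - operator_cut)  # operator skims each sale
            owed += new_money * (1 + promised_return)
            payout = owed * redemption_rate         # investors cashing out
            if payout > fund:
                print(f"Month {month}: redemptions exceed the fund -- collapse")
                return
            fund -= payout
            owed -= payout
            print(f"Month {month}: fund={fund:,.0f}  still owed={owed:,.0f}")

    # While inflows grow, early investors really do get paid; once new
    # money dries up, the scheme inverts and dies.
    run_ponzi([100_000, 150_000, 225_000, 80_000, 10_000, 0])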

Bitcoin, on the other hand, does not qualify as a Ponzi scheme. If Bitcoin is a Ponzi scheme, then so are gold, oil, and frozen concentrated orange juice. The reason is that all four trade on open markets. That’s right. If you happen to have a bitcoin, you can only sell it for what someone is willing to pay you. The same is true for an ounce of gold, a barrel of oil, or a pound of frozen concentrated orange juice.

I should point out that while Bitcoin, or any other commodity or currency, is not a Ponzi scheme, that does not prevent someone from operating a Ponzi scheme using bitcoins (or oil or gold). However, just as in the case of gold or oil or US dollars, someone operating a Ponzi scheme denominated in bitcoins does not make Bitcoin itself a Ponzi scheme.

To make a long answer short, Bitcoin itself is not a Ponzi scheme. It is merely a commodity (or currency, depending on your perspective) that trades on open markets with a free-floating price. You can buy at any time and sell at any time. The key is to deal with a reputable trader with proper infrastructure and security precautions. Just as you can make bad investments with US dollars, you can make bad investments with bitcoins. Just as you can put your US dollars in a bank that fails due to fraud or incompetence, you can entrust your bitcoins to a shady or incompetent operator.

Also, you should realize that, just like any other free-floating price, Bitcoin’s can be manipulated by legitimate and not-so-legitimate means. There are no guaranteed returns. If anyone tells you that you can “get rich quick”, run away, whether it is Bitcoin or something else. Sure, early investors in Bitcoin are potentially sitting on massive Bitcoin wealth, but just because someone bought low (or at zero, in the case of “mining”), it doesn’t mean it is a Ponzi scheme. There is nothing stopping you from risking an amount of your currency of choice attempting to buy low and sell high even today, just as countless people are doing with gold, oil, frozen concentrated orange juice, or any other commodity or currency!

I can suggest a Bitcoin exchange that seems to be trustworthy if you wish to convert between bitcoins and CAD: check out CAVirtEx. They seem to have their ducks in a row and they take great pains to make sure they know what is going on. They also have fairly solid market depth. They are run by real people, are a real corporation, and have registered with the relevant authorities for providing financial services. While none of that guarantees they won’t suffer the same fate as some other exchanges, it does put them on a fairly solid footing.