My Bitcoin Problem

I didn’t get enough of them…

Back in the good old days, Hikvision NVRs were part of an exploit used to mine Bitcoin; naturally, that was back when Bitcoin was used primarily to buy heroin and weapons via the darknet. Today, though, everyone and their dog is buying bitcoin like it was pets.com shares circa 2000, and the hardware needed to mine coins today is a million times more powerful than a cheapo NVR.

First things first: why do we need “currency”? I think it’s worth revisiting the purpose before moving on. Basically, “currency” is a promise that someone (anyone) will “return the favor” down the line. In other words, I mow your lawn, and you give me an IOU, which I trade for some eggs with the local farmer. The farmer then trades the IOU to you for getting his picket fence painted (you then tear up the IOU).

Instead of crude IOUs, we convert the work done into units of currency, which we then exchange. Mowing a lawn may be worth 10 units, while doing the dishes is worth 5. In the sweet old days, the US had many different currencies, pretty much one per state. They served the same purpose: to allow someone to trade a cow for some pigs and eggs, some labor for food, food for labor, and so on.

But pray tell, what politician, and what banker would not love to be able to issue IOUs in return for favors, without actually ever returning them?

Since politicians and bankers run the show, naturally, the concept got corrupted. Politicians and banks started issuing IOUs left and right, which basically defrauded you of your work. When you mowed the lawn on Monday, you would expect to be able to exchange the IOU for a lawn mowing on Friday, but with politicians producing mountains of IOUs, you suddenly find that the sweat off your brow on Monday only pays for half the work on Friday.

This is classic inflation.

By the same token, it would be one hell of an annoyance if you mowed my lawn on Monday and, to repay you on Friday, I had to not only mow your damn lawn but also paint your fence.

This is classic deflation.

What you want is a stable and fair currency: the work you do on Monday can be exchanged for an equal amount of work on Friday.

You can then wrap layers of complexity around it, but at its core, the idea is that money is a store of work, and that store should be stable.  The idea that we “need 2% inflation” is utter nonsense. In a democracy, the government can introduce a tax on cash equivalent holdings if the voters so desire. This would be more manageable and precise than senile old farts in central banks trying to “manage inflation” by purchasing bonds and stock, with the predictable side effect that it props up sick and useless companies. The idea that you can get work done by just shuffling some papers around is an abomination in my book.

Bitcoin is an attempt at creating a currency that can’t be manipulated by (presumably corrupt or incompetent) politicians and bankers, but I think they’ve gone far, far away from that idea.

The people who are engaging in bitcoin speculation are not doing it because they want a fair and stable store of work (having discarded traditional fiat currency as unstable and subject to manipulation). Instead, they do it because, in the speculative frenzy, bitcoin is highly deflationary: you can get a thousand lawns mowed on Friday for the lawn you mowed on Monday. As a “stable currency”, Bitcoin has utterly failed. And we’re not even discussing the transaction issues (200K backlogged transactions, and a maximum of about 2,000 transactions every 10 minutes).
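To put those two quoted figures into perspective, here is a back-of-the-envelope calculation (a rough sketch using only the numbers above, and assuming no new transactions arrive in the meantime):

```python
# How long it would take to clear the quoted backlog, using only the
# figures above and assuming no new transactions arrive in the meantime.
backlog = 200_000        # backlogged transactions
per_block = 2_000        # max transactions per ~10-minute block
block_minutes = 10

blocks_needed = backlog / per_block                  # 100 blocks
hours_to_clear = blocks_needed * block_minutes / 60  # ~16.7 hours
print(f"{hours_to_clear:.1f} hours to clear the backlog")
```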

This happens because bitcoin is not a currency at all. It’s simply the object underpinning a speculative bubble. And as with all bubbles, there are people who will say “you don’t understand why this is brilliant, you see…”, and then a stream of illogical half-truths and speculation follows. People share stories about how, in hindsight, they paid $100 for a cup of coffee 12 months ago because they used bitcoin to pay for it. But a cup of coffee in dollars costs about the same as it did 12 months ago, so while the dollar is being devalued by very mild inflation, and is thus a much more stable store of work, bitcoin is promising free lunches for everyone.

People, for the most part, take part in this orgy with the expectation that at some point, they will settle the score for real currency – real dollars. Very few (and I happen to know one) will keep them “forever” on principle alone.

Furthermore, I don’t see any reason why the Bitcoin administrators wouldn’t just increase the self-imposed 21 million coin limit to 210 million or 2.1 billion coins. They already decided to create a new version, called Bitcoin Cash, which essentially doubled the amount of bitcoin. That, and the 1,300 other cryptocurrencies out there, makes it hard for me to buy into the idea that there is a “finite number of coins”. Not only that: to increase transaction speed to something useful, they are going to abandon blockchain security, opening the door to all sorts of manipulation (not unlike naked short selling of stock, etc.).

And let’s not forget that before Nixon, the civilized world agreed to peg currencies to gold (a universal currency that could not be forged). In 1971, Nixon ended the dollar’s convertibility into gold, and since then the number of dollars has exploded and their value has dropped dramatically. In other words, what was a sure thing before Nixon was suddenly not a sure thing.

This is not investing advice. You might buy bitcoin (or other crypto-“currencies”) today, and make 100% over the next few weeks. You might also lose it all. I would not be surprised by either.

 

Net Neutrality

You can’t be against net neutrality, and, at the same time, understand how the Internet works.

There is no additional cost to the ISP to offer access to obscure sites; it’s not like a cable package, where the cable provider pays a fee to carry some niche channel that no one watches.

Basically, net neutrality means that the ISP has to keep the queues fair; there are no VIP lanes on the Internet. Everyone gets in the same line and is processed on a first-come, first-served basis. This is fundamentally fair. The business-class traveler may be angered by the inability to buy his way to the front of the line (at the expense of everyone else), but that’s just tough titties.

It’s clear that not everyone has the same speed on the Internet; I live in an area where the owners’ association decided against having fiber installed, so I have a shitty (but sufficient) 20/2 Mbit ADSL connection. My friend across the bridge, in Sweden, has a 100/100 Mbit connection at half the cost. But that has nothing to do with net neutrality.

If my friend wants to access my server, my upstream channel is limited to 2 Mbit per second. This is by my choice: I can choose to host my server somewhere else, I could try to get a better link, and so on, but basically, I decide for myself whom I serve and how much I want to offer. There are sites that will flat out refuse to serve data to certain visitors, and that’s their prerogative.

However, with net neutrality removed, my site may get throttled or artificially bottlenecked to the point where people just quit visiting it. I would have to deal with several ISPs, and possibly pay them a fee to remove the cap. If the site is not commercial*, I may not have the funds to do that. I may not even be aware that an ISP is throttling my site into oblivion, or be offered an option to remove the cap.

Clearly, ending net neutrality is not the end of the world. Guatemala and Morocco are two examples of countries without net neutrality. In Morocco, the ISPs decided to block Skype, since it was competing with their (more profitable) voice service, so that might give you a hint of what’s to come. People did complain to the King when the ISPs went too far, though.

Naturally, fast access to Facebook, LinkedIn, and Snapchat might be cheaper, and that’s probably all you care about if you’re against NN.

With cloud-based IP video surveillance starting to become viable, this might prove to be another, unpredictable cost of the system. Some ISPs already take issue with you hosting a web server via your retail connection, and they go out of their way to make it difficult for you to do so: changing your IP address every 4 hours and so on. This is to push you into a more expensive “business plan”, where they simply disable the script that changes your IP. I think it is safe to assume that if you’re streaming 30 Mbit/s 24/7 to an Amazon data center, the ISP will eventually find a way to make you pay. And pay dearly. Once you’ve hooked your entire IP video surveillance system into the cloud, what are you going to do? Switch to another ISP? #yeahright
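For a sense of scale, here’s the back-of-the-envelope math on what a 30 Mbit/s, 24/7 stream adds up to (a rough sketch; the 30 Mbit/s figure is the one from the paragraph above, the rest is plain arithmetic):

```python
# Rough upstream volume generated by a 30 Mbit/s stream running 24/7.
MBIT_PER_SEC = 30
SECONDS_PER_DAY = 24 * 60 * 60

mbit_per_day = MBIT_PER_SEC * SECONDS_PER_DAY   # 2,592,000 Mbit/day
gb_per_day = mbit_per_day / 8 / 1000            # ~324 GB/day
tb_per_month = gb_per_day * 30 / 1000           # ~9.7 TB/month

print(f"~{gb_per_day:.0f} GB/day, ~{tb_per_month:.1f} TB/month")
```

Roughly ten terabytes of upstream traffic a month from a single retail connection is exactly the kind of usage an ISP will notice.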

I guess the problem is that the ISP business model used to be to sell the same bandwidth 100 times over. Now that people are actually using the bandwidth, that model falls apart, and the ISPs need other means to make sweet sweet moolah. And that’s their nature and duty. But why cheer them on?

*In the early days, commercial activity on the Internet was banned.

 

HomeKit Flaw

https://9to5mac.com/2017/12/07/homekit-vulnerability/

Does this vulnerability shipping mean you shouldn’t trust HomeKit or smart home products going forward? The reality is bugs in software happen. They always have and pending any breakthrough in software development methods, they likely always will. The same is true for physical hardware which can be flawed and need to be recalled. The difference is software can be fixed over-the-air without a full recall.*

*Unless it’s a Chinese IP camera, then all “mistakes” are deliberate backdoors put in place by the government.

Facts and Folklore in the IP Video Industry

A while ago, I argued that just because JPEGs took up more storage space, it did not mean that JPEG offered superior quality (and certainly not if you compare H.264 to MJPEG at the same bitrate).

I now find that some people are assuming that high GPU utilization automatically means better video performance and that all you have to do is fire up GPU-Z and you’ll know if the decoder is using the GPU for decoding.

There are some who will capitalize on the collective ignorance of the layman and the ignorant “professional”. I suppose there’s always a buck to be made doing that. And a large number of people who ought to know better are not going to help educate the masses, as it would effectively remove any (wrong) perception of the superiority of their offering.

Before we start with the wonkishness, let’s consider the following question: What are we trying to achieve? The way I see it, any user of a video surveillance system simply wants to be able to see their cameras, with the best possible utilization of the resources available. They are not really concerned if a system can hypothetically show 16 simultaneous 4K streams because a) they don’t have 4K cameras and b) they don’t have a screen big enough to show 16 x 4K feeds.

So, as an example, let’s assume that 16 cameras are shown on a 1080p screen. Each viewport (or pane) is going to use at most (1920/4) * (1080/4) = 480 * 270 pixels; that’s around 130,000 pixels per camera.

A 1080p camera delivers roughly 2,000,000 pixels (1920 * 1080 = 2,073,600), so 15 out of every 16 pixels are never actually shown. They are captured, compressed, sent across the network, decompressed, and then we throw away about 94% of the pixels.
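If you want to check the arithmetic, it’s trivial (a quick sketch; the 4x4 grid and the 1080p resolution are the numbers from the example above):

```python
# Pixels actually shown per camera in a 4x4 grid on a 1080p monitor,
# versus pixels delivered by a 1080p camera.
screen_w, screen_h = 1920, 1080
grid = 4                                               # 4x4 = 16 panes

pane_pixels = (screen_w // grid) * (screen_h // grid)  # 480 * 270 = 129,600
camera_pixels = 1920 * 1080                            # 2,073,600

discarded = 1 - pane_pixels / camera_pixels            # 0.9375
print(f"shown: {pane_pixels}, delivered: {camera_pixels}, discarded: {discarded:.1%}")
```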

Does that make sense to you?

A better choice is to configure multiple profiles for the cameras and serve the profile that matches the client best. So, if you have a 1080p camera, you might have 3 profiles: 1080p@15fps, 720p@8fps, and CIF@4fps. If you’re showing the camera in a tiny 480 by 270 pane, why would you send the 1080p stream, putting undue stress on the network as well as on the client CPU/GPU? Would it not be better to pick the CIF stream and switch to the other streams if the user picks a different layout?
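To make the idea concrete, here is a minimal sketch of one possible selection policy. The profile list is the hypothetical one from the paragraph above, and picking “the largest profile that doesn’t exceed the pane’s pixel budget” is just one of several reasonable choices, not any particular product’s behavior:

```python
# Hypothetical profiles for a 1080p camera: (name, width, height, fps), smallest first.
PROFILES = [
    ("CIF",    352,  288,  4),
    ("720p",  1280,  720,  8),
    ("1080p", 1920, 1080, 15),
]

def pick_profile(pane_w, pane_h):
    """Pick the largest profile that doesn't deliver more pixels than the
    pane can actually show; fall back to the smallest profile otherwise."""
    budget = pane_w * pane_h
    best = PROFILES[0]
    for profile in PROFILES:
        _, w, h, _ = profile
        if w * h <= budget:
            best = profile
    return best

print(pick_profile(480, 270))    # pane in a 16-camera grid -> the CIF stream
print(pick_profile(1920, 1080))  # full-screen view         -> the 1080p stream
```

When the user switches layouts, the client simply re-runs the selection and asks the server for a different stream.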

In other words: a well-designed system will rarely need to decode more than the number of pixels available on the screen. Surely, there are exceptions, but 90% of all installations would never even need to discuss GPU utilization, as a bog-standard PC (or tablet) is more than capable of handling the load. We’re past the point where a cheap PC is the bottleneck. More often than not, it is the operator who is being overwhelmed with information.

Furthermore, heavily optimized applications often have odd quirks. I ran a small test pitting Quicksync against Cuvid; the standard Quicksync implementation simply refused to decode the feed, while Cuvid worked just fine. Then there’s the challenge of simply enabling Quicksync on a system with a discrete GPU and dealing with odd scalability issues.

GPU usage metrics

As a small test, I wrote the WPF equivalent of “hello, world”. There’s no video decoding going on, but since WPF uses the GPU to do compositing on the screen, you’d expect the GPU utilization to be visible in GPU-Z, and as you can see below, that is also the case:

The GPU load:

  • No app (baseline): 3-7%
  • Letting it sit: 7-16%
  • Resizing the app: 20%

This app, which performs no video decoding whatsoever, uses the GPU to draw a white background, some text, and a green box on the screen, so just running a baseline app will show a bit of GPU usage. Does that mean that the app has better video decoding performance than, say, VLC?

If I wrote a terrible H.264 decoder in BASIC and embedded it in the WPF application, an ignorant observer might deduce that my junk WPF app was faster than VLC, simply because it showed higher GPU utilization than VLC did.

As a curious side note, VLC did not show any “Video Engine Load” in GPU-Z, so I don’t think VLC uses Cuvid at all. To provide an example of Cuvid/OpenGL, I wrote a small test app that does use Cuvid. The Video Engine Load is at 3-4% for this 4CIF@30fps stream.


It reminds me of arguments I had 8 years ago, when people said that application X was better than application Y because X showed 16 cameras using only 12% CPU while Y was at 100%. The problem with the argument was that Y was decoding and displaying 10x as many frames as X. Basically, X was throwing away 9 out of 10 frames. It did so because it couldn’t keep up; it determined that it was skipping frames and switched to a keyframe-only mode.
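For what it’s worth, the trick application X pulled looks roughly like this (a sketch of the general idea only; the stream, decoder, and display objects are made up for illustration, this is not anyone’s actual code):

```python
import time

def render_loop(stream, decoder, display, max_lag_seconds=0.5):
    """Decode and show frames; if we fall too far behind the live feed,
    quietly drop everything except keyframes. The CPU meter looks great,
    but 9 out of 10 frames are thrown away without ever being decoded."""
    keyframes_only = False
    for frame in stream:                       # hypothetical frame objects
        lag = time.monotonic() - frame.capture_time
        if lag > max_lag_seconds:
            keyframes_only = True              # can't keep up; degrade silently
        if keyframes_only and not frame.is_keyframe:
            continue                           # skip the frame, undecoded
        display.show(decoder.decode(frame))
```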

Anyway, back to working on the world’s shittiest NVR…