Ed, man! !man ed

When I log into my Xenix system with my 110 baud teletype, both vi and Emacs are just too damn slow. They print useless messages like, ‘C-h for help’ and ‘“foo” File is read only’. So I use the editor that doesn’t waste my VALUABLE time.

Ed, man!  !man ed

via Ed, man! !man ed – GNU Project – Free Software Foundation (FSF).

When IBM, in its ever-present omnipotence, needed to base their “edlin” on a Unix standard, did they mimic vi? No. Emacs? Surely you jest. They chose the most karmic editor of all. The standard.

Ed is for those who can remember what they are working on. If you are an idiot, you should use Emacs. If you are an Emacs, you should not be vi. If you use ED, you are on THE PATH TO REDEMPTION. THE SO-CALLED “VISUAL” EDITORS HAVE BEEN PLACED HERE BY ED TO TEMPT THE FAITHLESS. DO NOT GIVE IN!!! THE MIGHTY ED HAS SPOKEN!!!
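The joke lands because ed really is that terse: it prints nothing it doesn't have to, which also makes it trivially scriptable. A minimal non-interactive session (the filename note.txt is just for illustration):

```shell
# ed reads commands from stdin; -s suppresses the byte-count diagnostics.
printf 'hello wrold\n' > note.txt
ed -s note.txt <<'EOF'
s/wrold/world/
w
q
EOF
cat note.txt   # -> hello world
```

The same s/old/new/ substitution syntax survives in sed, vi, and every regex flavor since, which is part of the "the standard" joke above.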

Level 3’s Selective Amnesia on Peering

Fortunately, Verizon and Netflix have found a way to avoid the congestion problems that Level 3 is creating by its refusal to find “alternative commercial terms.” We are working diligently on directly connecting Netflix content servers into Verizon’s network so that we both can keep the interests of our mutual customers paramount.

via Level 3’s Selective Amnesia on Peering | Verizon Public Policy.

Former FCC Commissioner: “We Should Be Ashamed Of Ourselves” For State of Broadband In The U.S.

He led off by agreeing with the several executive speakers that true competition is the way of the future, and the best way to serve consumers. “But we haven’t given competition the chance it needs,” he continued, before referring to how poorly U.S. broadband compares on the global stage. “We have fallen so far short that we should be ashamed of ourselves. We should be leading, and we’re not. We need to get serious about broadband, we need to get serious about competition, we need to get serious about our country.”

via Former FCC Commissioner: “We Should Be Ashamed Of Ourselves” For State of Broadband In The U.S. – Consumerist.

This Time, Get Global Trade Right

But the administration’s rationale for secrecy seems to apply only to the public. Big corporations are playing an active role in shaping the American position because they are on industry advisory committees to the United States trade representative, Michael Froman. By contrast, public interest groups have seats on only a handful of committees that negotiators do not consult closely.

via This Time, Get Global Trade Right – NYTimes.com.

The Comcast Merger Isn’t About Lines On A Map; It’s About Controlling The Delivery Of Information

The joy of being a vertically integrated company is being able to exercise something called vertical leverage. Basically, the bigger Comcast gets, the more extraordinary financial power they wield. The terms they can negotiate upstream and downstream are more likely to be favorable to them, and not to anyone else.

A report [PDF] from the Consumer Federation of America calls these “bottleneck points.” And the bigger Comcast gets, the more of them they have — as in their recent peering dispute with Netflix.

via The Comcast Merger Isn’t About Lines On A Map; It’s About Controlling The Delivery Of Information – Consumerist.

In the end, making Comcast bigger only gives it more leverage — a company that would control the lion’s share of to-the-home information for this country. Until such a time when (and if) wireless and fiber providers begin offering a service that competes with cable Internet on speed, availability and cost, consumers are only going to see the walls around Comcast’s sandbox grow taller, while bottlenecked Internet businesses face higher and higher tolls for access to a huge portion of American homes and offices.

Inside Major League Baseball’s “Hypothesis Machine”

Baseball data, over 95% of which has been created over the last five years, will continue to mount—leading MLB decision-makers to invest in more powerful analytics tools. While there are plenty of business intelligence and database options, teams are now looking to supercomputing—or at least, the spawn of HPC—to help them gain the competitive edge.

via Inside Major League Baseball’s “Hypothesis Machine”.

Please. The problem with current baseball analytics isn't the deluge of data; it's the deluge of crackpot theories that add more and more irrelevant variables to the mix. Most baseball analytics misuse mathematics and are created by people who are simply selling a website.

Speaking of selling a website: is this a good place to introduce the sister site to bucktownbell.com? 🙂

baseball.brandylion.com

All data in the above data model was crunched using perl, awk, and bash on a standard PC. Baseball is not so complicated that it requires a supercomputer to crunch historical or current-season data. More from the article…

He explained that what teams, just like governments and drug development researchers, are looking for is a “hypothesis machine” that will allow them to integrate multiple, deep data wells and pose several questions against the same data.
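To make the perl/awk/bash point above concrete: a season's worth of batting averages is a one-liner over a CSV. A minimal sketch (the file and numbers here are illustrative, not a real data set):

```shell
# Batting average from a simple CSV of (player, hits, at_bats) --
# the kind of crunching awk handles fine on a standard PC.
cat > stats.csv <<'EOF'
player,hits,at_bats
ruth,2873,8399
gwynn,3141,9288
EOF

awk -F, 'NR > 1 { printf "%s %.3f\n", $1, $2 / $3 }' stats.csv
# ruth 0.342
# gwynn 0.338
```

Swap the printf for a sort or a group-by in perl and you have most of what a "business intelligence" stack does for historical stats.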

What Michael Lewis Gets Wrong About High-Frequency Trading

The idea that retail investors are losing out to sophisticated speed traders is an old claim in the debate over HFT, and it’s pretty much been discredited. Speed traders aren’t competing against the ETrade guy, they’re competing with each other to fill the ETrade guy’s order. While Lewis does an admirable job in the book of burrowing into the ridiculously complicated system of how orders get routed, he misses badly by making this assumption.

via What Michael Lewis Gets Wrong About High-Frequency Trading – Businessweek.

Why, oh WHY, do those #?@! nutheads use vi?

Yes, there are definite reasons why the vi/vim editing model is just superior to any other out there. And you don’t need to be a Unix whiz to use it, either: vim is available for free for almost any platform out there, and there are plug-ins to get the functionality inside all major IDEs.

via Why, oh WHY, do those #?@! nutheads use vi?.

If you want to research vi/vim editing some more, here are some useful references:

And of course:

No the Internet is not a ‘value tree’

Projects like Wikipedia, uses such as text and data mining, online access to cultural heritage and educational resources, and transformative use of the Internet do not follow the same logic as the traditional content industry value chains. Here limited user rights and long terms of protection become problematic and increased enforcement translates into chilling effects.

At the same time all of these types of uses are exactly what makes the Internet special and drives its potential to accelerate innovation and to democratize access to knowledge, tools and culture. The Internet is the first mass medium that is simultaneously enabling market driven uses, uses that are driven by public policy objectives (such as education or access to culture), and uses driven by people’s desire to create, collaborate and contribute to the commons.

via Kennisland : No the Internet is not a ‘value tree’.

No, I Don’t Trust You! — One of the Most Alarming Internet Proposals I’ve Ever Seen

The technical details get very complicated very quickly, but what it all amounts to is simple enough. The proposal expects Internet users to provide “informed consent” that they “trust” intermediate sites (e.g. Verizon, AT&T, etc.) to decode their encrypted data, process it in some manner for “presumably” innocent purposes, re-encrypt it, then pass the re-encrypted data along to its original destination.

via Lauren Weinstein’s Blog: No, I Don’t Trust You! — One of the Most Alarming Internet Proposals I’ve Ever Seen.

In essence it’s a kind of sucker bait. Average users could easily believe they were “kinda sorta” doing traditional SSL but they really wouldn’t be, ’cause the ISP would have access to their unencrypted data in the clear. And as the proposal itself suggests, it would take significant knowledge for users to understand the ramifications of this — and most users won’t have that knowledge.

This editorial illustrates that, under the proposal, Man-in-the-Middle (MITM) interception cannot happen without user consent. The blogger fears that ISPs will require consent for all SSL sessions, making all users' end-to-end encryption vulnerable to a "trusted" proxy. Here is a blurb from the draft.

From the IETF draft:  Explicit Trusted Proxy in HTTP/2.0 draft-loreto-httpbis-trusted-proxy20-01

This document describes two alternative methods for an user-agent to automatically discover and for an user to provide consent for a Trusted Proxy to be securely involved when he or she is requesting an HTTP URI resource over HTTP2 with TLS. The consent is supposed to be per network access. The draft also describes the role of the Trusted Proxy in helping the user to fetch HTTP URIs resource when the user has provided consent to the Trusted Proxy to be involved.

The consent is supposed to be on a per-network (or destination) basis, which means there may be reasons a user agent would want to use a trusted proxy — perhaps it does not trust the destination network. The blogger implies that ISPs will want blanket consent over all destinations, but 1) they could implement that today without this standard, and 2) it would not make for a good PR move, because it would not go unnoticed.
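One way to make the "would not go unnoticed" point concrete is certificate pinning: a client that remembers the origin server's certificate fingerprint will see a different certificate the moment any proxy re-encrypts the session. A minimal sketch using Python's standard library (the function names are mine, not from the draft):

```python
import hashlib
import ssl


def fingerprint(pem_cert: str) -> str:
    """SHA-256 fingerprint of a PEM-encoded certificate."""
    der = ssl.PEM_cert_to_DER_cert(pem_cert)
    return hashlib.sha256(der).hexdigest()


def looks_intercepted(pinned: str, presented_pem: str) -> bool:
    """True if the presented certificate differs from the pinned one --
    e.g. because a "trusted proxy" re-encrypted the session."""
    return fingerprint(presented_pem) != pinned
```

In practice the live certificate could be fetched with `ssl.get_server_certificate((host, 443))` and compared against a fingerprint recorded out-of-band; a mismatch is exactly the signal that something sits between the client and the origin.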