Broadband

The sound of broadband from King George
Da da da dat da da da da ya
Da da da dat da ya da!
Da da da dat da dat da da da
Da ya da da da dat dat

Broadband Is Now a National and Community Necessity

Trigger warning: technical subject matter, unavoidable with broadband. We hope any resulting confusion will be relieved by the clarity of our recommendations.

It is widely accepted now that broadband is as necessary as electricity, clean water, and roads. It connects us to the Internet, through home networks attached to wire-line last mile networks, or through mobile networks. We all use it. It has become our social glue. Working at home, education at home, remote doctor visits, video entertainment including games, social media, connecting families together, news access, buying things, Zoom conferences, and getting questions answered on Google form a short list of what we gain from the Internet today, all made possible by broadband networks. Eighty-five percent of our country is now potentially served by cable television networks with broadband capabilities. However, speed demands on broadband have been growing exponentially and will continue growing exponentially, soon eclipsing the capacities of cable networks and current or future mobile networks. Everyone in the industry knows that fiber optic lines, with capacity exceeding cable networks by tens of thousands of times, are the broadband end game.

Unhappily, private carriers cannot afford to pass every home in America with new fiber optic wiring. Government subsidies will be required to fill the gap. State and local governments and carriers could seek such subsidies from the federal government, which has already offered nearly $80 billion to cover areas not covered by cable television. But we are recommending herein that a significant portion of such subsidies be provided by municipal governments. We recommend this unusual procedure for three reasons: (1) the federal government has been generally unwilling to provide all the assistance needed for infrastructure improvements, even when bridges and dams and roads are threatened with near-term collapse; (2) broadband has other problems that can only be remediated by local governments, and bundling them together with funding makes sense; and (3) it will establish a sense of local ownership that we believe is sorely lacking in our country, a lack whose reversal is needed to stem the tide of nonsense and danger afflicting our federal and many state governments and the two political parties that serve them. It is also an area where the people can make a real difference at local levels, but not at state or federal levels. We can hear the applause from Jefferson and Madison from here.

We offer herein the following sequence of topics: (1) what is broadband? (2) last mile networks; (3) projected penetration of residential fiber networks; (4) federal government attitudes and existing programs; (5) other problems with broadband; (6) what is missing? (7) what makes sense? (8) what we recommend; (9) what this site will contribute; (10) why broadband works so well as a model for other areas of our political enterprise; (11) epilogue—broadband programs as analogies for adjacent problem areas.

[NOTE: This section is longer, more technical, and more tutorial than others we provide to propose remediations for current problems. Some of the length and density arises from the coincidence that we know something of the field. But we believe the picture given here is representative of many areas in which our current system is either breaking down or falls too far short of meeting exigent demands. The Federalist Papers insist on adjusting means to suit ends, with experiments often needed to work our way around to new means that succeed. We have observed that ends are sometimes hard to specify, and that means can constrain ends, often around costs. We believe neither is the case here. Fiber optics is the established end game. Broadband to every home and business in America is necessary, as necessary as electricity and clean water. The actual costs are so much smaller than those of roads as to make cost objections risible. Yet we are not on that course. We are locked into old patterns around old problems, and bumping into worn-out ideologies every time we turn around. The technologies have gone through staggering innovations; we have not. We are the barrier to actually realizing broadband as a necessity. Only we can make it right. Making it right will spill over into habits we can apply to other problems. So we go further into this subject in part to illustrate how we might want to start thinking about other serious difficulties.]

[Image: Veblen's epigram]

Veblen’s somewhat sarcastic epigram has probably been more true of mankind’s industrial development than its more traditional inverse, dating back to Plato. Did we really need telephones? Bell’s invention arose from his studies of sound transmitted to deaf people; he did not start out intending to transform the world of human communications. But his invention became a necessity. We now see the same story being told about fiber optics, a transmission medium developed in parallel by Bell Labs and Corning that is becoming a necessity for human communications around the world.

Necessary implies universal, affordable, reliable, and as future-proof as current technologies permit. Unfortunately, our current broadband infrastructure does not meet this set of standards. More than 15 million homes in America do not have any access to broadband at data rates commensurate with common needs today; as many as 70 million homes in America do not have access to a future-proof network using fiber optics that satisfies all applications today and tomorrow; and as many as 30 million homes in thinly populated areas will migrate from an adequate network today to an inadequate network in the next few years for want of a fiber connection, increasing the digital divide absent public subsidies. While the federal government and a few state governments are funneling money into areas with “unserved” homes, those not passed by a cable television network in effect, they appear generally oblivious to the more deeply rooted problems of broadband in America.

A Brief History

Broadband is the latest stage in the evolution of telecommunication services in America. Telecommunications began with Bell’s invention of the telephone (patent application in 1876) and the emergence of ATT as a monopoly carrier for most of the country, a monopoly carrier heavily regulated from 1921 to 1984. (Some will object that telecommunications began with the telegraph, but that starting point affects nothing said here.) To keep residential telephone service affordable, state regulatory agencies allowed (or required) ATT to shift large sums from profitable long-distance and business markets to offset losses in residential service, where the highest capital costs arose from twisted-pair copper wiring. This arrangement allowed ATT to meet the criteria above for a necessary service and still earn 10% profits year after year. ATT stock was the bedrock of retirement plans for much of this period. The key point here is that residential telephone service was always subsidized, from within ATT, but still subsidized. Universal residential broadband will be no different.

However, in 1984 the Justice Department broke ATT, then the largest corporation on the planet, into pieces. In 1996 a new Telecommunications Act exchanged universal service and regulated monopolies for investment and competition in telephone, cable television, and radio markets. The move was a gesture to neoliberalism, the idea that free market systems would be most efficient at satisfying general community needs, a concept embraced famously by Ronald Reagan and Margaret Thatcher. The outcome of the 1996 Act was soaring telephone rates, massive consolidations in all three markets, and monopolies in many markets for broadband as it evolved. “Broadband” was not named as such in the Act, but “advanced telecommunications services” suggested the need. However, the Act offered only competition and investment as remedies for gaps in such services, a silly idea given that gaps arise from high infrastructure costs in thinly populated areas. Why invest? Where is the competition? Just as ATT required internal subsidies to provide affordable telephone services to all, broadband networks will require subsidies to provide affordable broadband services to all. How and why these subsidies should materialize will be the general subject of this essay.

What is broadband?

[Image: map of the world's undersea fiber optic cables]

This question can be answered generally, but runs into trouble when asked to be specific. Broadband is a high-speed data communications system connecting electronic devices such as computers, tablets, smartphones, televisions, and remotely controlled devices like thermostats to massive collections of computers called data centers. The connection runs through networks employing, for the most part, a protocol suite called the “Internet,” which taken broadly comprises a complicated arrangement of network and computer procedures that connect devices, route data traffic through the network, and manage page builds on screens. The “Internet” taken narrowly refers just to the transmission protocols; the “World Wide Web” manages the connection and page-build protocols. The lower-bound data rate for broadband is 1 megabit per second (mbps), or 1 million bits per second. As suggested by the picture above of fiber cables in the sea, broadband knows no borders.

The network part of broadband can be organized into three parts: (1) a wide-area network that spans the globe, itself comprising many interconnected private carrier networks built now almost exclusively with fiber optic lines interconnecting switches and routers; (2) a last-mile network connecting the wide-area network to homes, businesses, and other premises; and (3) a home network that distributes information from the termination of the last-mile network to various devices in the home or business. The last of these is now usually a wireless WiFi network. The arrangement is analogous to a road system, with state and national highways (the wide-area network) connected to local roads (the last-mile network) that in turn connect to garages and parking lots (the home network), with car speeds (data) and road capacity diminishing as they get closer to home, and with intersections acting as routing mechanisms. A sense of the wide-area network's enormous capacity: 870,000 miles of fiber optic undersea cables connect continents together, with as many as 32 fiber strands per cable, each fiber strand capable of transmitting 18 terabits per second (18,000,000,000,000 bps), enough for 4,000,000 simultaneous HD video streams. It is the most complicated and massive system humans have ever built, and it almost never fails. We might say that on a total system level, it never fails. Why it works so well is a story we tell at the end; it may have implications for some public policy issues in other fields. We note here that fiber's maximum capacity is not yet known. Laboratory systems have reached 125 terabits per second on a single strand. The limitations come from the transceivers, not the wire. This is more than 100,000 times the speed one can possibly achieve over conventional copper lines and all mobile networks used in telecommunications today. This is the reason fiber is the future.
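The capacity arithmetic above is easy to verify. A minimal sketch, assuming one undersea strand at 18 terabits per second and a mid-range HD stream of 4.5 mbps (a figure we pick from within the 3 to 7 mbps range discussed later; real streams vary):

```python
# Simultaneous HD video streams one fiber strand can carry.
# Figures are illustrative: 18 Tbps per strand (undersea cable),
# 4.5 Mbps per HD stream (mid-range of the 3-7 Mbps cited in the text).
STRAND_CAPACITY_BPS = 18e12   # 18 terabits per second
HD_STREAM_BPS = 4.5e6         # one HD video stream

streams = STRAND_CAPACITY_BPS / HD_STREAM_BPS
print(f"{streams:,.0f} simultaneous HD streams per strand")
# → 4,000,000 simultaneous HD streams per strand
```

Multiply by 32 strands per cable and the numbers become hard to visualize, which is the point.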

The network part of broadband is statistical, in the sense that lines are shared by several users, or hundreds of users, even millions of users in the wide-area network, meaning the bandwidth (data capacity) may be insufficient for all users on a particular link at some periods of the day. Many experience the whirling circle when Netflix movies are interrupted for this very reason, 9:00 at night being the heaviest time of use for broadband in the United States. Some data transmitted during these intervals are lost. The wide-area network today is overbuilt, meaning incidences of dropped data occur infrequently. However, the last-mile and home networks are often not sufficient to carry all data for all users sharing a line. These networks are called “best effort” as a consequence. Networks provide protocols designed to ensure that data does get through somehow (usually by repeated attempts), but real-time signals like video streams simply lose the data. Many Internet Service Providers buffer video signals at the last office before the last-mile network to compensate for such problems, but the compensation is never perfect.

However, traffic loss is not the most serious problem with broadband networks. We said above that getting specific with broadband runs into trouble. The trouble is nominal data rate to the home or office. What is it for broadband? We measure data in bits per second (bps). A “bit” is a “binary digit,” one with just two states, 1 or 0. Combinations of bits represent various forms of information. A voice signal converted to digital form by itself requires around 8,000 bits per second (8 kbps). An HD video stream requires between 3 and 7 million bits per second (mbps). A 3-D video conference stream requires 100 mbps; an 8K video stream is close to that figure. An uncompressed video stream for Virtual Reality, likely the only way we are getting VR through remote resources, exceeds 1 billion bps (gbps), the so-called “gig” threshold. Uploading a terabyte file (which will be commonplace in not many years) in less than 20 minutes requires 10 gbps; at 20 mbps, a common maximum on today's cable television networks for upstream data, the upload takes more than four days.
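The file-transfer arithmetic can be made explicit. A small sketch, treating 1 terabyte as 8 trillion bits and using the nominal rates from this paragraph (real throughput runs below nominal, so actual times would be longer):

```python
# Time to upload a 1 terabyte file at two nominal upstream rates.
FILE_BITS = 8e12                      # 1 terabyte = 8 trillion bits

def upload_seconds(rate_bps: float) -> float:
    """Idealized transfer time; ignores protocol overhead."""
    return FILE_BITS / rate_bps

fiber_minutes = upload_seconds(10e9) / 60      # 10 gbps fiber
cable_days = upload_seconds(20e6) / 86400      # 20 mbps cable upstream

print(f"fiber: {fiber_minutes:.1f} minutes")   # → fiber: 13.3 minutes
print(f"cable: {cable_days:.1f} days")         # → cable: 4.6 days
```

The same file, a factor of 500 in line rate, and the difference between a coffee break and most of a week.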

In the early days of broadband (circa 2000), the maximum data rates were 6 mbps downstream, to the user, and 1 mbps upstream, from the user to the data center, all constrained by the last mile network and the modems used therein. Twenty years later, that figure was 1 gigabit per second in both directions with fiber optic lines and 500 mbps downstream, 30 mbps upstream for cable television networks. Today one can realize 10 gbps in both directions over fiber, with 25 gbps available soon with no changes to the cables. (As we noted above, the differences are not in the fiber optic wire but in the transceivers; the 18-terabit undersea cables use transceivers that are way too expensive for commercial deployment to individual users.) From the earliest days of access speeds to the Internet, from 2400 bps maximum in 1980 to 10,000,000,000 bps today, the compound growth rate has been 46% per year. To say these speeds will not be needed ignores the realities of broadband.
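The growth figure follows from the standard compound annual growth rate formula. A quick check under the assumption of a 1980 to 2023 span; the exact percentage shifts a few points with the endpoint years chosen:

```python
# Compound annual growth rate (CAGR) of Internet access speeds.
start_bps = 2400      # dial-up maximum, 1980
end_bps = 10e9        # 10 gbps fiber, roughly today (2023)
years = 2023 - 1980   # 43 years

cagr = (end_bps / start_bps) ** (1 / years) - 1
print(f"{cagr:.0%} per year")
```

This yields roughly 43% per year over 43 years; a 40-year span gives the 46% figure cited in the text. Either way, speeds have multiplied more than four-million-fold.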

The chart above is typical of charts projecting future data rate requirements. The figures are averages, not peaks. The specific figures are less important than the trend (most charts have different figures, some higher, some lower, but always with this shape). The trend is exponential, and has been exponential since the early days. There are two drivers for the growth. One is the increased appetite for new applications, driven in part by so many applications involving pictures and video content with increasing pixel densities, with concomitant growth in file sizes. The second is the growth in network capacity from fiber and advanced cable networks that stimulates new applications, a kind of “if it's there it will be filled” phenomenon. We only got HD television because data speeds increased enough to send several streams at one time down a line. To realize these speeds we will need changes in both last mile and home networks. How these materialize and who pays for them will be a constant concern.

We note here that this condition, of demand for increasing data rates, did not and does not obtain for other network services. Electricity began at 120 volts at 60 Hz, and remains there. Telephone systems demanded 4 kHz of bandwidth per call, and remain there. Wires used for both 100 years ago are still just fine for today. Road widths have not changed because car and truck widths have not increased. Water pipes serving homes have remained at ¾ inch or 1 inch, with in-home pipes at ½ inch diameter. Sewer line laterals are 4 inches in diameter. Airplane demand for increased runway lengths has caused expensive upgrades to major airports, but the larger number of small airports have not been expanded. Analog television signals occupied 6 MHz of bandwidth, and would have stayed there but for the digital revolution. By comparison, digital versions of standard definition television require 1.5 mbps, HD requires from 3 to 7 mbps, 4K television requires around 30 mbps, and 8K television requires around 90 mbps (some advertise lower data rates, but the idea is the same—expanding the pixel density of screens expands the data rates required for transmitting content).

We make this observation to suggest the role habits of mind play in public policy. We will discuss the causes of our public failures in broadband below. But a significant problem is that our elected leaders and the bureaucracies that govern services such as broadband are accustomed to network services that do not change in their fundamental nature and needs. Compounding this problem is the circumstance that almost all telecommunication and television services have been provided by private, for-profit companies. It is true that the FCC played a significant role in the transition from analog to digital television, overseeing the standardization of compression protocols for one thing. But the FCC now seems blind to the chart above and its implications even as every document they publish on broadband begins with the claim of necessity.

Last Mile Networks

America was blessed, when broadband began, with ubiquitous telephone networks and nearly ubiquitous cable television networks, each with copper-based cables that could simultaneously carry voice or video signals and high-speed data signals within the last-mile network. We got the Internet connected to most Americans by using existing wire with special modems, or transceivers. (An alert reader may recognize this as a form of internal subsidy, infrastructure paid for by other markets.) Telephone wiring ran out of capacity around 2010 and has largely disappeared from the market (less than 15% share now and declining rapidly). Cable networks have been able to keep up with speed demands with increasingly powerful modems, owing the increase to Moore's law, that transistor density on integrated circuits will double every two years (Moore stated this law in a 1965 paper; it still holds). However, coaxial copper cables are reaching the limits of their capacity, a matter of physics. Cable companies are touting so-called DOCSIS 4.0 modems as the avenue to 10 gigabit speeds, but they are still a year or two away, they will be expensive, they will require some network engineering and reconfiguration, they will only be usable by select customers because the lines are not capable of wholesale upgrades to DOCSIS 4.0, and then that is the end. Meanwhile, fiber-optic last mile networks have passed 60 million American homes, about 45% of total homes. These networks are typically installed today with 10 gigabit symmetric speeds built in, even if the customer only pays for, say, 1 gigabit. In two to four years the likely default installed speed will be 25 gbps.
Fiber networks are cheaper to install, way cheaper to maintain than coax networks (the latter parameter is 12 to 1 by estimates of the cable industry itself, which deploys many miles of fiber in its hybrid fiber/coax networks), much more reliable (no electronics in the weather), and can be upgraded almost indefinitely without upgrading the last mile fiber plant. Fiber is the network for the next century.

We should mention one other form of last mile network, those that are wireless. We have three general types: conventional 4G and 5G mobile networks, fixed terrestrial wireless, and satellite networks. They have the benefit of being much less expensive than wire-line networks to every home. But they each have an Achilles heel. They are seriously band-limited, meaning data rate limited. Wireless systems use frequency bands allocated by the FCC. The pipes are only so big. The promotional shout a few years ago about 5G reaching 10 gbps has died to a whimper as 5G struggles to get to 1 gbps. The SpaceX network using Low Earth Orbit (LEO) satellites offers 100 mbps downstream and 20 mbps upstream, apparently seldom realizes these speeds, and cannot easily increase these speeds after satellites are launched. (They also require a $500 roof-mounted antenna.) They may satisfy certain mobile applications and serve homes in the wilderness, but they are not the future. Geostationary satellite services, using satellites some 22,000 miles above the earth, suffer more than 500 milliseconds of round-trip delay, which kills a number of important applications, and their data rates resemble the pace of an army of snails. Fixed wireless systems suffer the same limitations as mobile networks: they are band-limited and hence data rate limited. The last two options also suffer losses in capacity with rain and foliage. They cannot touch fiber networks for capacity or reliability.

Projected Penetration of Residential Fiber Networks

[Chart: projected penetration of residential fiber networks. Source: RVA LLC]

The first residential fiber networks were installed by Verizon in 2005. Verizon stopped entering new markets in 2011, a change in Verizon leadership shifting capital spending to mobile. The fiber build from other telephone companies was lackluster until Covid. Covid exposed the limitations of cable networks as homes needed simultaneous access to schools, remote work, and medical services. The lack of capacity from cable companies often forced families to timeshare network use. Starting in 2021, all telephone companies have been installing fiber as fast as market supply allows (Verizon is an outlier in this respect, not gearing up as much as the others, but they started with the largest footprint). Various industry estimates circle the number 60 million as the count of homes passed to this point (mid-2023). The new build capacity is on the order of 5 million homes passed a year (it might grow as the supply of cable itself grows).

However, there is a limit. No telephone company has promised to pass every home in its market footprint. As home densities thin out, the returns required to justify capital investment become marginal and then completely and irreversibly insufficient.

It is very hard to estimate the number in this condition. But the public notices from phone companies have consistently suggested around 70% anticipated coverage. This comes to 92 million homes (out of 132,000,000 total). Cable television companies claim to pass 85% of American homes, leaving around 20 million out. It is likely that the 70% of fiber homes from telephone companies falls within the same housing footprint covered by cable companies. Some cable companies have also moved to fiber upgrades; Altice (Optimum) is just about finished passing all 4 million of its customers, although many of its networks are in municipalities with housing densities high enough that incumbent telephone companies will also convert or have already converted their networks to fiber. (The latter is the case in Fairfield County, CT, but not in Litchfield, CT, both served in part by Optimum.) However, Comcast and Charter, which together own 80% of the American cable television market, are clearly banking on at least another decade of their copper-based coaxial networks (both have spot fiber networks, but all under odd circumstances, such as state funding). They are banking on only small fractions of the market actually needing 10 gbps speeds, and on those few being willing to pay a premium for the pleasure. However, the present cable coax plant is generally hostile to these speeds without reengineering, rerouting cable, and installing new nodes to avoid amplifiers, work that is not only expensive but time consuming. Comcast has been known to wait many months before filling orders that require new cabling. In a fiber network, new work orders can be filled in a day. Where would you like to live?

Who and how many are left out? We estimated above that 20 million homes are not passed by a cable network or a fiber network. The preponderance are in contiguous rural areas, often in unincorporated areas, but some percentage are at the edges of smaller towns where housing thins out, and in large, old apartment complexes with inadequate internal wiring, even if a coax or fiber line passes the building. (The FCC estimate is smaller, but for our purpose that is not material. FCC broadband maps are constructed from carrier estimates and are notoriously inaccurate. In the last round of subsidy programs from the Infrastructure Investment and Jobs Act (2021), the first effort at mapping unserved homes by the FCC had to be adjusted upward by more than 3 million homes after inputs from state broadband maps. It is likely that the true number of unserved homes, impossible to know precisely, is over 20 million.) It would be nice if we could rely upon federal and state subsidies to cover all unserved homes over the next decade with fiber optic networks. Counting on that would be in the same doghouse of illusions we see from some public officials today. Somewhere between $70 and $80 billion has already been pledged by federal and select state governments to provide broadband to unserved homes (see next section). Unhappily, a large fraction of the money so far awarded has gone to or will go to wireless networks, kicking the can down the road. We will say some things in our program section of this paper about the residue, but we can say here that the costs of passing many of these homes with fiber will seem prohibitive. We say “seem” prohibitive. Compared to roads, fiber optic lines are cheap. We build roads through all of these rural communities, because car travel is necessary. We supply electricity to these homes, because electricity is necessary. Broadband is now necessary. Ergo?

If we are right, that current and planned fiber networks for the most part pass homes already passed by a cable network, the number of homes passed by a cable network but not likely to be passed by a fiber network from private funding alone is on the order of 20 million homes, 15% of the total. These homes will be our principal concern below.

Federal Government Attitudes and Existing Programs

This is not a national tragedy. But it should be considered a national problem, a serious one. Fifteen percent of American homes without a necessary utility would be totally unacceptable for electricity, clean water, or roads. It should be totally unacceptable for broadband.

Unfortunately, our federal government, and those state governments that have shown any interest, have bound themselves to FCC notions of “broadband” that made no sense when the FCC produced its last definition in 2015, and make far less sense today. To the FCC, “broadband” means 25 mbps down, 3 mbps up. Any network that can operate at these speeds or better is considered broadband. Any home passed by these networks is “served.” While this excludes older DSL systems using telephone lines and most geostationary satellite systems, it includes all cable television networks, LEO satellite systems, and 4G and 5G mobile networks. Yet none of these networks today can satisfy high data-rate requirements, largely around file transfer, and as we argued above, they are DOA for future applications except specialty cable networks using DOCSIS 4.0 modems, themselves only a stopgap before cable must convert to fiber.

The problem with using such measures for “broadband” compounds with the nature of all cable television networks and what the FCC allows them to say about data rates. Cable networks are heavily shared, sometimes with more than 100 users on a single line. If the nominal or marketed rate of a line is 25 mbps down, 3 mbps up, the FCC considers it “broadband.” But what each user gets during times of high usage is a small fraction of that figure. Ten simultaneous users would drop 25 mbps to 2.5 mbps each, not enough to receive an HD video stream. Most cable networks today provide much higher nominal or marketed rates downstream, but they suffered greatly during Covid when several users per home needed access at the same time. Many families report having to time share the network, with those working from home taking one time slot, kids filing school reports taking another.
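The sharing arithmetic is worth spelling out. A deliberately simplified worst case, assuming every user on the line is active at once (statistical multiplexing normally means fewer are, which is the only reason the nominal rate is marketable at all):

```python
# Per-user share of a 25 mbps cable line when users transmit simultaneously.
NOMINAL_MBPS = 25          # FCC "broadband" downstream threshold
HD_FLOOR_MBPS = 3          # minimum rate for an HD video stream (per text)

for users in (1, 10, 100):
    share = NOMINAL_MBPS / users
    verdict = "ok for HD" if share >= HD_FLOOR_MBPS else "below HD floor"
    print(f"{users:>3} users -> {share:6.2f} mbps each ({verdict})")
```

At ten simultaneous users the “broadband” line cannot deliver a single HD stream per household, which is the point of the paragraph above.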

The 2021 Infrastructure Investment and Jobs Act (IIJA) included a $65 billion broadband component, of which $46 billion was devoted to new construction of broadband networks, of which $42 billion will go to states. The money is to go to “unserved” homes first, which will absorb the entire amount. The Act was technology neutral, but in an effort to spread its dollars to the most homes, each proposed project must survive competitive bidding within the state of the project, with a 25% local match, a condition that strongly favors cable TV extensions and wireless networks. In a smart move, the National Telecommunications and Information Administration (NTIA), which administers the IIJA broadband program, provided in its rules a condition that a proposed fiber project would trump any other technology, regardless of relative costs. This is regulatory over-reach, but so far complaints from wireless, mobile, and cable companies have not prevailed.

We note that a similar form of over-reach by the EPA regarding coal-fired electricity plants was overturned by our current Supreme Court (2022 West Virginia v EPA), a terrible decision for the planet (and it could easily have gone the other way), but one not as egregious as other decisions of this court. The path from law to regulation is generally tricky. The IIJA was 2500 pages long and covered many forms of infrastructure. No elected representative understood the whole thing; most would not have a clue about the nature of the broadband market as outlined here. Their staffs would be hounded by carriers who not only carry the authority that comes from providing necessary services, but understand the underlying technologies much better than any staff person we have encountered. Cable company assurances that they will provide 10 gigabit symmetric services over their existing network would likely be believed, as it takes some understanding of the wires, the shared network architectures, and signaling technologies to blow the argument into little pieces to fall on the floor. Given the history of telecommunications outlined above, the reliance upon carriers has been built in for a century, the reason the FCC is largely hostage to them even with Democratic Presidents who have the opportunity to appoint a new FCC chairman (the FCC is otherwise an independent agency, one in which the chair or other commissioners can only be terminated for cause).

Viewed objectively, with the market as characterized here, investing a dollar in anything other than fiber optics is just delaying the inevitable (excepting of course those few homes that can only be served by wireless networks, mostly in Alaska). The NTIA recognizes this enough to be more specific about suitable technologies. They are right to do so. Whether it is legal, of course, is for a court to say. We learn this from Federalist 78 and 1803 Marbury v Madison. As we are learning now from the current court, what is “legal” means what conforms to a party ideology, at present a very conservative version of the Republican Party. This has always been so to some degree, although Earl Warren was a Republican who ushered into the Court record books more liberal and progressive decisions than any other Chief Justice except John Marshall.

Note as well that our federal government tends to underfund critical infrastructure projects. The IIJA authorized $1.2 trillion for diverse infrastructure needs. Given the mood in Congress at the moment, one in which Republicans appear ready to compromise critical services as the price of raising the debt ceiling (this written in May of 2023), the IIJA may be the last infrastructure investment for some time (our last Republican President promised $2 trillion but asked Congress for not a dollar—it took Biden to get at least a good portion of that sum in play). Estimates from engineering firms put the figure required for repair of existing infrastructure between $3 and $5 trillion. On the broadband front, the sums provided to date by the federal government are not sufficient to cover all unserved homes, by tens of billions of dollars. The areas left out are the most expensive to serve—farm areas with one home per mile of road, say. The subsidies in the finished and signed IIJA bill were half what the original House bill proposed, and even that was insufficient. We say it again: our federal government does not consider broadband a necessity despite repeated claims to the contrary.

There are Other Problems


The broadband section of the IIJA recognizes serious problems with adoption, digital equity, digital literacy, and access to broadband for those on the economic margins. Of course almost everyone in America seems to have a smartphone and a subscription that gives them Internet access. But 75% of smartphone traffic goes through home or business wire-line networks, not mobile networks; areas with thin housing footprints have spotty mobile networks; and smartphones are poor substitutes for laptop computers for education, working at home, remote telemedicine, and video entertainment, not to mention current and future applications for the Internet of Things and Virtual Reality. Yet 50 million Americans do not subscribe—we use the word “adopt”—to home-based broadband services. Those who cannot afford it, and who are also on some other federal subsidy program, may apply to the Affordable Connectivity Program, administered by the FCC, which carriers must support (they all do) and which provides a lower-data-rate service for $30 per month that carriers can recover from the FCC. However, this program has proved to be awkward, hard to apply to, and confusing. It is also not quite permanent; it absorbs $14 billion within the IIJA, but that will run out eventually.

It also happens that many people in America do not know how to use the Internet. Most of them are either old or poor. While not quite as bad as not knowing how to read, digital illiteracy creates an increasing limitation on participation in American life, adding to our cultural divide as well as digital divide. Given that anyone under the age of, say, 20, has grown up with digital tools in their hands from infancy onward, this problem will likely disappear in time, but for quite a few decades it demands attention from social service agencies. Many libraries already have digital facilitators to help people with Internet access and use. But agencies we have contacted say much more is needed.

Home Networks

Among the problems that will emerge as broadband speed demands increase are home networks. They are not mentioned in the IIJA and are generally ignored by official regulatory agencies. But home networks are like parking lots, slow and awkward to navigate. Even today, if you have subscribed to a 500 mbps downstream rate from a cable or fiber optic provider, you may get no more than 200 mbps through your home network. Most home networks use WiFi routers for wireless connectivity. (For what it’s worth, “WiFi” is not an acronym, although some claim it means “Wireless Fidelity”; it was created by a marketing firm to give an upbeat designation to the IEEE 802.11 standards that cover small area wireless networks.) The original band for WiFi communications was around 2.4 GHz, the same range used by microwave ovens, many television clickers, garage openers, baby video monitors, and wireless home telephone systems. This band does not require an FCC license (it was originally opened for industrial, scientific, and medical uses without licensing). Competition for bandwidth comes from many sources. Over the last few years WiFi routers have been able to add 5 GHz and 6 GHz bands, but higher frequencies lose range quickly (free-space loss grows with the square of the frequency), and the new bands only work if connected devices support them. Most new laptops, smartphones, and tablets support all three ranges today, but devices a year or two old, or more, often will not. We have also found that certain peripherals such as printers with WiFi access get hopelessly confused with new routers employing multiple frequency bands.
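The range penalty at higher frequencies can be made concrete. A minimal sketch (the helper function is our own, not from any WiFi standard) computes free-space path loss, which grows with the square of frequency: moving from 2.4 GHz to 5 GHz costs about 6 dB at the same distance, more than a four-fold power loss, before walls and furniture make the gap worse.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss across a 10-meter room for the three WiFi bands
for ghz in (2.4, 5.0, 6.0):
    print(f"{ghz} GHz: {fspl_db(10, ghz * 1e9):.1f} dB")
# 5 GHz loses about 6.4 dB more than 2.4 GHz at the same distance,
# i.e. over 4x less received power.
```

Real rooms add wall and furniture attenuation that hits the higher bands even harder, which is why a 6 GHz router two rooms away can underperform an old 2.4 GHz one.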

The technical answers to these current and future problems with home networks have been “live with it” (meaning move near the router with your laptop), mesh routers, and internal home wiring to multiple routers. The first is, in the long term, unacceptable; there are too many instances of several people sharing the network to make close proximity to a router practicable. Mesh network routers may be placed around a home, each one talking to its neighbors to eventually route data to the last-mile network termination for connection to the Internet. Unfortunately, the way mesh routers have to talk to each other takes away half the bandwidth available to a user for each router added. You could start with a gigabit and end up with 125 mbps. The best and most secure solution is home wiring from the last-mile network termination to each room in the house where people will need Internet access, with WiFi routers connected thereto. Note that even this arrangement will not deliver 10 gbps if you have a 10 gbps fiber service; to realize the full speed you have to connect your device directly to the terminating transceiver over an ethernet link. (A further problem is that each WiFi router with expanded capacity costs between $250 and $1,000 or more, depending on features; those on the economic margins will find them too expensive. With each advance in networking we create advances in the digital and cultural divide.)
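The mesh arithmetic above is just repeated halving. A two-line sketch makes it explicit (the halving factor is the rule of thumb stated in the text, not a precise RF model):

```python
def mesh_throughput_mbps(base_mbps: float, relaying_routers: int) -> float:
    """Rule of thumb from the text: each mesh router that must relay
    traffic toward the last-mile termination halves the bandwidth
    left for the user."""
    return base_mbps / (2 ** relaying_routers)

# A gigabit service reached through three relaying mesh routers
print(mesh_throughput_mbps(1000, 3))  # 125.0
```

Three relay hops thus turn a gigabit subscription into 125 mbps, which is why wired backhaul between mesh nodes (or home wiring to individual routers) preserves so much more of what you pay for.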

Rewiring a house is unappealing if it means opening and then repairing walls to install cables, and likely too expensive for many. However, as we will point out below, at the least building codes should be updated so that all new buildings and all serious remodels incorporate CAT6 or better wiring for routers while walls are already open and finishes yet to be applied. Those working at home may be able to prevail upon their companies to fund a selective home rewiring. An important question will arise over time: whether this constitutes a renewal of the digital divide that deserves government attention. We will argue below that it does, but the argument is weaker than the argument that all homes should be passed by fiber.

Carrier Customer Service


Finally, we have to say something about carrier customer service. It is uniformly terrible. When ATT was a regulated monopoly, its service profile was exceptionally good. But ATT governed a network that did very little, even though it grew dramatically in complexity, just connecting one phone to another and holding the circuit until someone hung up. In the 1950s ATT started to support data transmission over phone lines, at the blazing speed of 300 bits per second, enough for slow typing. (ATT Bell Labs invented the modem, or data transceiver, to perform the miracle.) ATT also built, in the telephone itself, one of the most reliable instruments ever made. Their biggest problem was telephone lines broken during storms, but even this was minimal because the telephone wire was always the lowest on the pole, the electrical wiring taking the brunt of falling tree branches. If you were an adult before the smartphone, you will remember that electricity would go out but your phone service remained operational. Why? Because ATT supplied 48 volt DC power down the phone line to operate the telephone rather than rely upon house power, and their lines seldom broke in a storm.

We are not attempting here to suggest a return to regulated monopolies for broadband. Some of the reasons for breaking up ATT came out of a zeal for competition in the abstract, an idea that, applied to telecommunications with its huge capital costs, makes very little sense, as we have seen in the aftermath of the breakup of ATT. However, it was also clear by the mid-1970s that technology innovations springing out of Silicon Valley and other places were moving past the capacities of Bell Labs and the ATT network itself. As telecommunications moved to data communications, with the Internet evolving, with local area networks evolving, with personal computers evolving, with software becoming a critical industry, the natural monopoly was no longer efficient. The modern dial-up modem, Ethernet local area networks, the personal computer, DSL, word processing, Facebook, and Google were born in or around Palo Alto, not in New Jersey where Bell Labs held forth.

We might tell a story to illustrate this phenomenon. The modern dial-up modem was invented by a small company in Palo Alto in 1972. It promised 1200 bps in both directions, a four-times improvement over the then-current standard of 300 bps, and in contrast to a 1970 Bell Labs Technical Journal article arguing that 600 bps was as fast as modems could go on the dial-up network. The basic architecture of the new modem enabled advances in speed to 56 kbps as Moore’s law improved signal processing speeds. The early Internet is hard to imagine without it. Bell Labs took note and developed its own version, but made some mistakes. They chose carrier frequencies (for easier digital implementation) that compromised performance, adopted a flawed loop-back testing mechanism that caused the modems to freeze periodically, and swapped the send and receive carriers, which precluded an important application at the time, acoustic coupling through a normal handset. This matter was brought before Judge Greene, the trial judge in the 1981 Justice Department case against ATT. He was sufficiently interested that he asked the witness to give him a tutorial on modems the first afternoon of his testimony. The witness did so after paper charts and pens were provided. After the witness came through cross-examination without damage, Judge Greene was heard to say “this was [a] very important day” in the trial. He ordered the break-up of ATT a year and a half later. David and Goliath? Probably not. But the episode speaks to the sea change from large, immobile companies like ATT and IBM to the more nimble, quicker pace of small entrepreneurial companies.

However, when the telecommunications companies that emerged after the 1996 Telecommunications Act finally settled into a variety of local monopolies, they all decided to treat customer service as a cost center, not a product or a portion of a product. This was true for telephone companies as well as cable television companies. ATT, Verizon, Comcast, and Charter all collapsed their service groups into back-office organizations, created product lines with ever increasing complexity and confusion, marketed products with varying levels of hyperbole that often created false expectations (5G is the crown prince of this group, with 10 gigabit service from Comcast in the running), and, to save money, attempted to automate service and transfer human interaction to people in foreign countries with heavy accents, who only get on the line after fifteen minutes of fighting through robotic voices that cannot talk to you.

It must be said that this also now applies to airlines, hotel chains, Amazon, Microsoft, Apple, Google, and many other enterprises. We can blame Mike Markkula for this transition in a way. He was the second president of Apple Computer, famous for his battles with Steve Jobs over product strategy. He fostered the idea of user-integrated products: users buying a computer from Apple, peripherals and software from others, and putting the whole thing together themselves. While critical to the growth of these markets, the practice created a mess for customer service: whose product was at fault for a problem in a high technology world that few customers understood? We use broadband under this very principle, and suffer this very problem. No one has purview of the full experience, and a problem may arise in one part yet seem to come from another. This is particularly relevant to problems that arise within home networks that users attribute to the last mile supplier. Last mile carriers complain that 90% of service calls arise from problems other than ones the carrier can solve.

Only if you live on Mars have you not been subjected to this problem. It is too hardwired now into how these companies operate to expect any serious change or return to real customer service from large corporations. However, we have some thoughts below about municipal or regional approaches that would make the customer experience much better without changing the carriers.

What is Missing?

The FCC and the Department of Agriculture also have subsidy programs for “unserved” homes, now in the billions of dollars. These dollars generally go directly to carriers rather than to the states. Some reasonable proportion of the dollars from recent grant programs has gone to fiber optic networks, but wireless networks have also received significant sums. What is missing from all of these programs is a sense of the whole broadband experience and the necessity of subsidies for the many homes theoretically “served” today that will not be “served” tomorrow. What is missing is taking broadband as a necessity.

In our view what is also missing, critically, is municipal interest and leadership. The principal sources of political energy and control over electricity, water, roads, indoor plumbing, schools, storm drains, building codes, public transit, airports, and other necessities come from municipal governments, not state or federal governments. Any home built today will face a high stack of codes controlling what can and cannot be done about many necessary features of any home. But as far as we know, no municipality has troubled itself with requirements of home networks and wiring required for broadband. More importantly, few municipalities have troubled themselves with filling the gap between what carriers will provide and the total demand within a town, or looked ahead at the future of broadband demands and begun the calculus for what must be done to assure universal, affordable, reliable, and future proof broadband, now a necessity.

We can attribute this condition in part to our historical reliance upon private carriers to provide telecommunications without significant municipal involvement beyond rules about street cabinets and utility poles in public rights of way, as well as to the historical reliance upon state and federal agencies to regulate private carriers when regulations seemed necessary. We can also attribute it to the highly technical nature of the conversation about broadband, which leaves most people in confusion. If you understand the technical part of what is said here you are ahead of most. But the subject cannot be discussed without these technical references.

What Makes Sense?

At least one fiber optic broadband network should pass every home in America as soon as possible (excepting the few outliers that are really too far away from a central office, mostly in Alaska). This will not be accomplished by private carriers without financial support to cover areas with housing footprints too thin to justify infrastructure costs. There are several ways, in theory, to accomplish this.

1. The federal government could buy up all existing private fiber trunk lines passing homes, fill in all the gaps, and offer connections to multiple private providers for drop wire (connecting the wire on the pole to the home), in-home electronics, and ISP services, creating real competition in a market that is presently a monopoly in 50% of the country. This is comparable to how we treat roads, municipal electric power systems, and public water supply—public ownership and maintenance of the distribution system, private ownership of connections to and equipment within homes, the latter usually constructed through competitive bidding processes. The FCC estimates costs for such an adventure would run around $400 billion; yearly maintenance would be in the $320,000,000 range. If the capital costs were bonded over twenty years, the costs per year would be around $27 billion, or 0.5% of the federal budget (or $84 per American per year, or $7 per month). The fiber itself lasts at least 40 years, so after the first twenty years, the costs per year will only be for maintenance at less than $1 per American per year for the next twenty years.
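The bonding arithmetic above can be checked with the standard loan-amortization formula. A sketch, where the $400 billion capital figure is the FCC estimate cited in the text and the 3 percent, twenty-year bond rate is our assumption:

```python
def annual_payment(principal: float, rate: float, years: int) -> float:
    """Level annual payment that amortizes `principal` at `rate` over `years`."""
    return principal * rate / (1 - (1 + rate) ** -years)

capital = 400e9                              # FCC estimate, national fiber build
payment = annual_payment(capital, 0.03, 20)  # assumed 3% twenty-year bond rate
print(f"${payment / 1e9:.0f} billion per year")         # ~$27 billion
print(f"${payment / 330e6:.0f} per American per year")  # ~$81, about $7 per month
```

At roughly 330 million Americans, the annual payment works out to around $81 per person per year, consistent with the $84-and-$7-per-month figures above; a higher bond rate would push the totals up modestly.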

This makes way too much sense to even consider. Think of it as the modern counterpart to Eisenhower’s construction of the national highway system, a critical contribution to economic growth, easy citizen movement, and quality of life that all could use without discrimination. Our current federal government will never do it. Acute hostility from our major carriers—Verizon, ATT, Comcast, Charter—will still the hearts of those in Congress and the FCC. It’s socialism, after all, the death of America as we know it.

2. The federal government could compel the telephone companies that are now passing with fiber those homes that fall within their sense of suitable returns to do the rest within their territories, with the federal government paying the difference between what the private carrier is willing to spend and the true costs. Wireline telephone service blankets the country even if only 40% now subscribe, the balance using mobile phones for basic telephone service. The telephone companies would own the wire and provide maintenance. Some arrangement would have to be made for the time down the road when the wire will have to be replaced, but that will average in the 50-year time frame. Pricing such an arrangement accurately is almost impossible. However, $50 billion is likely high. These are homes already passed by cable television networks, meaning they are not distant rural communities with some homes a mile apart, but ones with housing footprints thick enough to justify cable passing.

This assumes that, between federal and state subsidies within existing programs and extensions of them for unserved homes, all such homes will be passed by fiber over the next decade. Estimates for this work vary a great deal. The sums thus far committed, in the $80 billion range including the IIJA funding, could leave as many as 5 million “unserved” homes still unserved. Note that these subsidies are for an entire system, including drop wire and home electronics. Drop wire in rural areas can be very expensive, as it can require long underground runs or new utility poles constructed on private property. Note as well that these subsidies have paid for inferior wireless solutions that will have to be upgraded to direct fiber at some point.

This program also makes sense. It will be vigorously opposed by Comcast and Charter because it creates competition where there was none, while putting the cable companies at a cost disadvantage because the government subsidized the capital expenditures. (But remember that cable companies are using wire that was paid for by other markets.) It also requires the second part: subsidies for all homes currently unserved so they too have fiber optic connections, at a cost much greater than $50 billion. It also requires the FCC to get over the cable view that most people do not need such high speeds, will never need such high speeds, and that cable networks will offer 10 gbps symmetric capability on a selective basis sufficient to the narrow need such speeds represent.

Could this happen? We think not. It would take a sophisticated public to understand the need to give carriers billions of dollars to do what many would say the carriers should do on their own. It would take a major reversal in FCC approaches to carrier relations, and it would require Congress and the executive office to actually understand what they were doing.

There is also a reason for the public to resist these two options—neither addresses the full suite of problems: adoption, digital literacy, a comprehensive subsidy program for those unable to pay, home networks, and carrier customer service. So we move to a third alternative, the one we recommend.

What We Recommend—Local Control

Simply put, we recommend the second alternative above, but with the subsidies arranged between municipal governments and carriers, with money from municipal governments. (We actually favor the first alternative above, combined with municipal programs for equity and home networks, but it is so unlikely to be realized that we must leave it alone.) We further recommend the establishment of a permanent local committee, preferably with official municipal status, that blends volunteer with paid staff in large municipalities but may survive on volunteer staff alone in small communities. This committee will be responsible for: forging an agreement with a local carrier to ensure universal coverage within the municipality; monitoring and assisting where necessary with the construction, maintenance, and marketing of that network; organizing information for residences and businesses about last-mile services, home networking, opportunities for subsidies for those on the economic margins, and video streaming, games, virtual reality, and other possible advantages of high-speed networks; creating and implementing digital literacy programs through various venues in the municipality; upgrading building codes to include home wiring for low-voltage communications; and overseeing a service triage operation that will radically improve customer service. These are the necessary support services for a necessary utility.

We must say first that municipal hostility to this idea has been high, even insuperable in most cases. There are hundreds of municipal fiber networks in America today, some private, some public, all but two we know of subsidized by federal or state funds if subsidized at all. The two we know of are Highland, Illinois and Islesboro, Maine, the latter an island with 500 people. Both used municipal tax dollars to create a fiber network. Most municipalities have worked overtime to avoid committing local resources for broadband. Some municipalities in western Massachusetts have constructed municipal fiber utilities with about half the costs provided by the state, but they used revenue bonds rather than general obligation bonds for the rest, paying them off through user fees. These towns were without cable television service and suffered from old Verizon telephone service, so they get 85% to 90% take rates even though the monthly user costs are at or greater than common costs for cable television broadband. This model unfortunately compromises services to those on the economic margins and privileges the rich. A general tax is progressive; debt retired through fixed user fees is regressive.

There is also a visceral aversion to providing money to incumbent carriers. Their poor service grates on everyone; why should the community be asked to subsidize them when the relationships start so badly? There are three good reasons. One is that an incumbent carrier can construct a fiber network at roughly half the costs of a new company attempting the same thing. The largest saving comes from their ability to wrap new fiber optic wiring around their existing copper cables rather than attach the wire to poles directly. This alone saves about 25% of construction costs around “make ready,” the process of moving existing wires around to make ready for a new one. Incumbent carriers already have one or more central offices in place along with “back-haul” connections, upstream fiber links to the wide area network. If their existing wires are underground, the conduit in which they reside may be reused rather than built anew, a very expensive process. They already have maintenance trucks and maintenance facilities. They already have all necessary easements for using poles or other facilities on private properties, a common condition in quasi-rural communities. They already have administration facilities in place.

Our short message: get over it. The sums are so small relative to other expensive utilities that we accept as necessary—just think of road repairs—that, after the first small tax increase, no one will notice.

The second reason is related, depending upon circumstances. There are three models for fiber networks within a municipality relative to legacy carrier coverage: (1) full coverage from the carrier, which pays for construction entirely; (2) partial coverage from the carrier, the balance requiring subsidies; (3) no coverage from the carrier. In the first, municipal governments can worry about adoption, digital equity, and home networks, and forget the last-mile network except for the service component we discuss below. In the second, the municipality should find a way to cover the gap with public funding; we are proposing here that such funding come from the municipality. The third circumstance very likely arises because almost all homes are too far apart. Such municipalities very likely do not have cable television services, hence will qualify someday, if not already, for federal or state subsidies to create a network anew.

(It is the case that municipalities and unincorporated areas with very thin housing footprints and no cable television network will find new network costs to be high, seemingly out of reach for local funding and unattractive for federal funding. The answer some unincorporated areas have come to is high user fees, say $200 per month. One network in Minnesota took this approach some years ago because farming has become critically dependent upon high-speed data communication networks. We repeat: compared to roads, electric grids, airports, and water distribution systems, fiber networks are cheap. Wire on existing poles can be attached for $30,000 to $50,000 a mile; road repair can run $1,000,000 a mile if not done frequently enough, and must be repeated. Fiber is forever, relatively speaking.)
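The per-mile comparison in that parenthetical is stark enough to state as arithmetic (the figures below are the text's own estimates, not independent data):

```python
fiber_low, fiber_high = 30_000, 50_000  # aerial fiber on existing poles, per mile
road_repair = 1_000_000                 # one road-repair cycle, per mile

# Even the high-end fiber estimate is a twentieth of a single
# road-repair cycle, and road repair must be repeated for decades
# while the fiber, once hung, lasts 40+ years.
ratio = road_repair / fiber_high
print(f"Road repair costs {ratio:.0f}x as much per mile as new aerial fiber")
```

A rural mile with a single home is thus a $30,000 to $50,000 problem for fiber, which is the scale of expense municipalities routinely absorb for far less durable infrastructure.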

The first model is less common than one might imagine. As carriers have no regulatory requirement to cover everyone for Internet (as is the case within cable television franchises), they make fiscal judgements about who to pass and who not to pass. Before Covid many fiber network providers would wire up a few blocks with good housing densities and see how many subscriptions they could earn; if enough, they would move on. This kind of market testing has been common. It obviously stops at blocks with not enough houses to justify the trunk wiring. In any event, each municipality should take it upon itself to understand the percentage of homes covered.

We are going to assume hereafter that the third circumstance named above will either be taken care of over time by federal or state subsidies, as these are the truly “unserved” communities, or will be forced to accept an expensive relationship with an existing carrier, still likely to be a telephone company that has already supplied twisted pair wiring everywhere. We remain convinced that accepting a wireless solution because the cost per home is so much lower is a mistake; wireless networks cannot support the data rates that will feel mandatory in ten years.

The third reason is also related. It is likely that many of these areas that go unserved with a new fiber network from a legacy carrier will be spotty, small sections of towns, rather than any area large enough to justify a new network from anyone else. The carrier building adjacent networks on their own may be the only way to actually get fiber optic broadband to the rest.

Among the reasons for tackling broadband deficits at the local, municipal level are:

  1. Without a significant change of heart in Washington, homes now passed by a cable television network will not be eligible for federal or state subsidies for a fiber overbuild. Washington does not change its heart very often, and the pressure on spending is not upward, but downward.
  2. Telephone company records are often wobbly. One should never trust their assessment of homes not passed or to be passed on their own initiative or money. A municipal committee armed with town tax records should make an independent assessment of coverage so the statement of needed coverage is accurate. In larger cities, older apartment houses should be examined. In smaller towns, roads with few houses, usually on the periphery of towns, should be examined.
  3. This will be a public/private partnership with duties and benefits to both sides. No two telephone companies have the same set of rules, desires, and intentions. As this kind of partnership is rare now (we know of few examples), partnership contracts are likely to vary from place to place and carrier to carrier. These cannot be negotiated from Washington, and the last thing anyone should want is a set of FCC rules dictating terms. Note in particular that developing the service support system we recommend below will take different forms in different places, making a common form from Washington improbable and likely undesirable. Note that certain features of a contract will apply to all connections, not just those subsidized by the municipality, the negotiation of which will also be carrier- and town-specific. One is unlikely to get price caps, but competition from cable companies will likely keep prices in line. But one can imagine universal Service Level Agreements (SLAs) such as those related to mean time to repair, minimum actual delivered data rates, business standard SLAs, and cooperation with the service triage system we propose below.
  4. As one may judge just by this recitation, broadband has technical components that are not commonplace in political discourse. A local committee with at least one member with some grasp of the technology will enable clear communications between carriers and municipalities that can otherwise be confusing and often unproductive. Not everyone has to know what OLT, ONT, SLA, back-haul, dark fiber, lit fiber, spectrum, VOIP, dense wavelength division multiplexing, dynamic range, splicing, packet interleaving, HFC network, and XGS-PON mean. (The last two name the basic architectures of cable television networks and a common form of fiber optic networks.) But many of these words will show up in contracts and discussions with providers as if they were commonly understood.
  5. Along the same line, a local committee will improve the knowledge of those on the committee and increase their capacity to communicate to others the nature of and reasons for conducting the committee’s business in certain ways, and to understand the ways of private Internet Service Providers. This is not a fight won in a day with a single battle. This is the ongoing provision of a necessary community utility that is central to life, and will become more central as things like Virtual Reality, the Internet of Things, and holographic teleconferencing for medical visits and data collection become commonplace. It cannot be administered from Washington. It has to be administered locally, like most social services.
  6. Many municipalities today suffer deficits in mobile phone coverage. The best way to fill out mobile gaps with 5G or higher (in the future) services is through small cell antennas attached to existing poles, new poles, or other structures, each connected back to a Core Network through fiber optic lines. If the fiber lines are there and under some form of community contract, the community (if owner) or the private party (if owner) may under such a contract use dark fiber within existing cables for connections. The fiber lines are usually the most expensive part of a network fashioned with small cell antennas. (Small cell antennas have sparked a revolt by those worried about radiation. These worries are baseless. The source of the highest power levels in cell phone communications is the smartphone itself, not the signal received by the smartphone from a distant antenna. The power transmitted by a smartphone is inversely proportional to the power of the received signal; the closer one is to an antenna, the lower the transmitted power from the phone that enters the body. There is also no epidemiological evidence that smartphone radiation is harmful. We cannot test human beings for their sensitivities to radiation; experiments with rats and cells in petri dishes have been either inconclusive or negative at levels of energy actually absorbed through smartphone use, and many studies showing some effect have not been replicated.)
  7. Both electric utilities and departments of transportation around the country have been installing fiber optic lines for grid monitoring and management and for road monitoring and management.  These would be installed much faster if these utilities could use existing fiber optic wiring.
  8. There are two general reasons for building an administrative resource for managing broadband network expansion and maintenance that is local. One is to participate in what we see as a necessary rebalancing of sovereignty in America, with more power at local levels than before. This can only be accomplished if local authorities take up the challenge and meet it. Of course, among the challenges is that local authorities will have to spend more money, a painful duty. Taxpayers seem to believe that local taxes are more personal and offensive than federal taxes, even when federal taxes are paying for the same things. But the actual administrative costs will not be high, and to the extent that a better network produces better values in the community, they will be an investment with positive returns.

The second is the building of social capital. In small communities the team may be entirely composed of volunteers; but even in large cities we advise a combination of paid staff and volunteer assistance. The social services side of broadband should be an attractive magnet for people wanting to help with limited time commitments. There are collateral benefits from communal services, in friendships, in knowledge gained about government administration generally, and in a sense of the other, of diversity, that has been shown many times to be a beneficial palliative to the all too common sense of isolation and self-indulgence that permeates our current social order.

Service Triage

Carrier customer service is terrible. We try to explain above both why it is terrible and why it is likely to stay that way. What we failed to say is that not all parts of carrier customer service are terrible. If you can get past the numbing minutes (it seems like hours) of trying to get a human being on the phone, and then get him or her to finally understand what you want and connect you to someone with the knowledge and power to actually do something, you will find a highly skilled and responsive set of technicians and an unbelievably nimble system of network management facilities that can test and discover faults from a distance, sometimes measured in thousands of miles. As noted above, the reason you cannot get to these people immediately is that customer calls very often arise from problems that exist outside the scope of the carrier, were caused by the customer, or arise more from confusion than from something actually wrong. (Even in the old days service departments would often complain that their call level would decrease by a factor of four if customers just read the manual. It is the analog of patients not following doctors' orders, a contagion in health care. Read Federalist 37.)

These teams, often called "Tier 2 support teams," can be reached directly if you have their phone number. Distribution of these numbers is usually restricted to field technicians of the company and adjacent-market field technicians who are known to the inside technicians. If someone installing a new home WiFi network needs the cable company router switched into bypass mode, he makes a call, says who he is (they may already recognize his voice), and the deed is done within 60 seconds. While we have not tested the idea proposed below with any incumbent carrier, we believe it would be readily accepted.

Imagine a call center, smaller than 911 call centers but similarly equipped, designed at the beginning for business-hour calls on weekdays and 8 to 5 on weekends, in which service calls would be answered by a human being. This would be the first surprise. Suppose this person had on his or her screen the home and record of the caller, drawn from a database supplied by the local fiber carrier(s) and correlated to phone numbers given by the user on subscribing to the service. The screen would show the location of the ONT (the modem terminating the fiber optic line) and the layout of the WiFi system and the make of the WiFi routers, with Tier 2 service numbers for each.
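The lookup behind such a screen can be sketched as a simple keyed record. Everything below, the field names, addresses, phone numbers, and router make, is an invented illustration of the idea, not a real carrier schema:

```python
# Hypothetical triage database: caller ID keyed to a subscriber record
# supplied by the local fiber carrier. All names and numbers are
# illustrative assumptions, not real data or a real schema.
SUBSCRIBERS = {
    "555-0142": {
        "address": "12 Elm St.",
        "ont_location": "basement, NW corner",
        "wifi_routers": [
            {"make": "ExampleRouter AX3000", "tier2_phone": "555-0199"},
        ],
    },
}

def triage_screen(caller_id):
    """Return the record the triage representative sees, or None."""
    return SUBSCRIBERS.get(caller_id)

record = triage_screen("555-0142")
print(record["ont_location"])
```

The point of the sketch is only that the caller's phone number is the join key: the representative never has to ask for an account number before the screen fills in.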

In an enhanced view of this service, the user would have installed a surveillance box between the ONT and the first (usually master) router in the network, or between the ONT and the ethernet switch used to route data to and from an in-home wiring system. This box would monitor traffic, keep records of traffic, know the network addresses of all connected devices, and be equipped with network management testing facilities pointing both ways: into the home WiFi network and out into the last mile network. An alternative would be to rely upon current and future cloud-based monitoring systems that can spot problems but are less adept at separating home network contributions from last mile network contributions.

In many cases the concern of the user will have been identified before he or she calls. If not, tests can be made to decide if there is a problem and whether the problem is on the home network side or the last mile network side. Often the user will be asked to turn off and then turn back on a device in the home network, a common solution with today’s electronics. If this fails or is obviously not going to work, the triage representative makes a call to the Tier 2 service agent of the responsible side of the network, communicates to whoever answers the call what he or she thinks is happening, and lets the conversation proceed between the service technician and the user.

Of course, you are asking yourself two questions: (1) who pays for it, and (2) how to organize it. As this will be new for any community, we would suggest municipal funding for the first year or two, until the local ISP (telephone company) has enough take rate to ascertain the level of staffing and tools needed to make it work. As take rates approach 50% (which this system will accelerate), costs may shift to users, in part or in whole, depending upon the actual sums involved. It will be obvious that towns smaller than 50,000 people will require a regional service triage center, meaning around 95% of American municipalities. Counties in most states may become useful sponsoring agencies. (911 call centers have similar range requirements.) We would recommend tapping into 911 call centers for help in establishing the early versions.

One benefit of such a program is that it would force Charter and Comcast to accelerate fiber installations in their networks and buy into the same program. They will grump, but in the end it creates the best broadband environment, one with real competition in most markets (those "unserved" now are unlikely to ever see two networks installed). The two cable companies are relying now on two factors that will preserve their market base: (1) most people will not need speeds available from fiber but not available from DOCSIS modems, making fiber and coax networks seem comparable; and (2) most people will not switch unless the cost and benefit improvements are significant. Verizon has yet to reach 50% market share in areas where it installed its FIOS system, which included a competitive television package; people hung onto cable for these two reasons. A further reason is that cable companies tend to drop prices by 20% for two-year contracts when a fiber network appears. (We have no immediate knowledge of what effect this may have had on Verizon's decision in 2011 to suspend fiber deployment in new market areas, leaving FIOS to be largely a northeast event.)

What This Site Will Contribute

As this is a completely untested idea, this site will try to solicit, online, some early adopters who see the benefits and have the political capacity to put the idea into practice. We have set up a Google space at FederalistPapersProject.gmail. You may enquire there about how to participate and what we might be doing.

Why Global Telecommunications Including the Internet Works So Well

The global telecommunications system, embracing telephone, video, and Internet traffic, connects a large percentage of the world together—five billion people would not be an outrageous guess, and it could be higher. After all, over 16 billion smartphones have been sold, two for each person on earth. Villages in Africa without water or electricity have forged local Internet hot spots with hand-generated electricity that enable them to communicate with smartphones and buy and sell things. Their "system" constitutes a form of broadband. From a network point of view, smartphones ask the same things as laptops and tablets, without the extreme demands imposed by huge file uploads. It is hard to think of a rival to global telecommunications for complexity, reliability, sweep, and necessity.

Yet it is almost entirely created and managed by a large number of private companies competing with each other but supported—"guided" might be a better word—by a large number of private standards organizations. We are not going to trace the evolution of this system from the model out of which it emerged: nationalized (government-owned) telephone companies in every developed country except the United States, which instead had a private monopoly carrier with regulated pricing. That carrier created most of the early technologies and innovations used everywhere else; from its ideas emerged Silicon Valley and its U.S. descendants, which produced most of the next generation of innovations, though such innovations now spring from other countries as well, notably Finland and China.

A full explanation is beyond the limits of this web site, and is almost certainly beyond human comprehension if "full" is taken seriously. Read Federalists 37 and 10. It involves too many moving parts, too many unconnected initiatives that merged into an ever-evolving machine far beyond the complexity of anything else the human mind has imagined, except perhaps the human mind itself, which is never fully accessible to itself. So we are going to touch on some of the important parts.

Let us note before starting that a critical part of the success of this suite of organizations is the acceptance by its participants of submission to community standards. It is a system without overall governance. No one body controls it all. No one body controls any of its pieces. It is fit together by myriad things and protocols that must interact with each other with no central management. Things can be added or subtracted without disturbing the rest. Failures are always localized and generally trigger automatic workarounds. It might be called republican with a small “r.”

Standards Bodies

The key to the entire system's success is a massive body of standards groups that all work the same way (we know of no exceptions). Standards bodies hold periodic meetings of engineers and other technically sophisticated people to hammer out standard protocols and equipment specifications for one of the many components of the system. Voting is by consensus—all participants have veto power. As the participants for the most part represent competing companies, the system forces compromises that often conclude with something other than the best answer, but one that is adequate and acceptable. Membership generally implies the willingness to license patented products or procedures at reasonable rates, retaining a competitive marketplace. The point is to get experts in the same room to hammer out a successful if not the optimum answer, with enough textual description at the end to enable anyone to fulfill its demands for interoperation with equipment and systems made by others.

The mother of these groups is the International Organization for Standardization (ISO), a nonprofit corporation headquartered in Geneva. However, the top of this group, to the extent there is a hierarchy, is the International Telecommunication Union (ITU), an agency of the United Nations (which grew from a private telegraph standards group formed in 1865, itself an outgrowth of a group formed decades earlier to develop standards for semaphore signaling so armies meeting accidentally could communicate with one another, that in turn arising from an eighteenth-century idea that competing armies representing competing countries would produce enduring peace, an idea that seemed to work until 1914—nothing is forever—see the end of Federalist 14). The ITU approves and catalogues most broadband standards, but most are developed in other private standards groups. The Internet Engineering Task Force builds Internet standards. The IEEE builds local area network transmission standards. ANSI builds modem standards. The 3GPP combines groups into one focused on mobile network protocols. The Broadband Forum builds protocol suites for local area networks above the transmission layer.

The ISO Stack

So, how do all these things fit together into an integrated system without a roadmap, a general architecture? The answer is systemic fragmentation of the parts with minimal specifications about how they interconnect. Among the most important organizational mechanisms standardized for broadband writ large is the ISO stack created by ISO in the late 1970s. It defines seven layers of a broadband network in such a way that any one layer feeds the layer above, may or may not know about the layer below, but each layer may develop independently of the others. The table below defines the layers.

Layer 7, Application: network services delivered to end-user programs
Layer 6, Presentation: data formats, compression, encryption
Layer 5, Session: opening, managing, and closing dialogues between machines
Layer 4, Transport: end-to-end delivery and error control
Layer 3, Network: routing packets across networks (home of IP)
Layer 2, Data Link: grouping bits into frames with local addresses
Layer 1, Physical: transceiver protocols for moving bits over a medium

The physical layer is misnamed—it has little to do with the physical means of transmission—copper, fiber, spectrum in the air—and more to do with transceiver protocols: how to get bits from one end of any medium to the other with maximum speed and efficiency. The Data Link layer begins the process of grouping bits into frames or packets, with send and receive addresses. Media Access Control (MAC) addresses fit here, twelve-digit hexadecimal numbers used as physical addresses for any device connected to a local network. While there is some switching involved, or possible, Layer 2 protocols tend to be local. The Network layer, where we have the Internet Protocol (IP) and IP addresses, encapsulates Layer 2 packets into IP packets for transmission through the global network, again using send and receive addresses. How each node in such a network knows where to send a packet next depends upon massive maps contained in every network switch; the path is not known or established ahead of time; a packet may arrive at its destination ahead of one that started out before it, which is why packets are numbered. Such is the fun of packet networking.
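The point about numbering can be seen in a few lines. This toy sketch of ours (not a real protocol; real IP headers are binary) "sends" a message one character per packet, lets the network deliver them out of order, and has the receiver reassemble by sequence number:

```python
# Toy illustration of packet numbering: each packet carries a sequence
# number, the network may deliver packets in any order, and the
# receiver restores the original order by sorting on that number.
import random

message = "SUCH IS THE FUN OF PACKET NETWORKING"

# Number each one-character "packet" before sending.
packets = [(seq, ch) for seq, ch in enumerate(message)]

# Different routes mean different arrival orders.
random.shuffle(packets)

# The receiver sorts by sequence number to reassemble the message.
reassembled = "".join(ch for _, ch in sorted(packets))
assert reassembled == message
```

No matter how badly the shuffle scrambles the arrival order, the sequence numbers let the receiver recover the message exactly.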

The transport layer deals with managing movement through the network and with error control; the TCP protocol recognizes errors and requires retransmission of broken packets. The TCP/IP marriage was commonplace in the early days, but speed demands combined with the low error rates of fiber lines and improved forward error correction protocols have allowed networks to largely drop TCP, which slows things down, pushing error control to Layer 7 or outside the network itself, leaving it to terminal hardware for things like financial transactions. The layers above 4 have nothing to do with networking.
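The retransmission idea can be sketched in miniature. The lossy link below is simulated, and real TCP uses checksums and sequence-numbered acknowledgements rather than this toy loop, but the principle is the same: keep resending until the receiver acknowledges a clean copy.

```python
# Toy sketch of TCP-style error control: the sender retransmits any
# packet that arrives corrupted, until the receiver acknowledges it.
# "Corruption" is simulated with a random drop; this is illustration only.
import random

random.seed(7)

def lossy_send(packet):
    """Deliver the packet, or return None to simulate corruption."""
    return packet if random.random() > 0.3 else None

def reliable_send(packets):
    """Send each packet, retransmitting until it gets through."""
    received = []
    for pkt in packets:
        while True:                      # retransmit until acknowledged
            delivered = lossy_send(pkt)
            if delivered is not None:    # receiver ACKs a clean packet
                received.append(delivered)
                break
    return received

assert reliable_send(["a", "b", "c"]) == ["a", "b", "c"]
```

The cost of this guarantee is the waiting, which is exactly why high-speed fiber networks prefer to push error control out of the transport layer when link error rates are low enough.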

The power and importance of the ISO stack would be hard to exaggerate. IPv4 was adopted in 1980; it is still the dominant routing protocol for the Internet, but its address space has been all but exhausted. The replacement protocol, IPv6, has enough address space to name every atom in the universe but is not backward compatible with IPv4; it has been in use for about two decades, and the slow transition spurred Network Address Translation (NAT), which stretches the IPv4 space by reusing private addresses inside local area networks and translating between private and public addresses when necessary. Meanwhile, best Internet data rates have grown from 2,400 bps (1980) to 56,000 bps (1996) to 6 Mbps (2000) to 10 Gbps (now) without anyone in the transmission business caring a whit about Layers 2 through 7 or what the data was used for, over copper, fiber, and wireless media, with wireless media moving from 2G to 5G without caring about the Internet or wire-line modems or local area networks. Meanwhile, Layer 2 protocols moved from two very simple local area networks (LANs), one from IBM and one from 3COM in the early 1980s, to IEEE standardization around the 3COM protocol, called Ethernet, to switched Ethernet, to mesh LANs, to gigabit LANs, without giving a damn about the underlying Layer 1 protocols or the Layer 3 networking protocols. Meanwhile, companies that worry about compression for video signals, or encryption of data for security, have no need or appetite for grasping transmission technologies or protocols. Meanwhile, those who worry about the World Wide Web live happily in Layer 7 without sleepless nights wondering what is happening below. (Of course, this is never perfect, but the dividing line has been maintained for so long that most developers have absorbed the culture that one does not rely upon anomalies or hoped-for changes in an adjacent layer to solve problems in developing new protocols in another layer.
But we can report one violation that proved very important in the early going, which the industry absorbed rather than forcing strict conformity. The modern dial-up modem incorporates a Layer 2 feature of necessity—it has to understand the beginning and end of bytes, not bits, to make successful transitions between synchronous and asynchronous signaling—you don't have to understand what this means. What evolved over time was the industry standardizing on one bit combination for a byte rather than the four in use in the early 1970s. But these instances are few and, as far as we know, never seriously disruptive.)
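How NAT stretches the IPv4 space can be sketched with a toy translation table; the addresses and port numbers below are illustrative only, and a real NAT router also handles timeouts, protocols, and checksum rewriting:

```python
# Toy Network Address Translation: many private addresses share one
# public address, with the router assigning a distinct public port to
# each private (address, port) pair and remembering the mapping.
PUBLIC_IP = "203.0.113.5"   # illustrative public address

class Nat:
    def __init__(self):
        self.table = {}       # (private_ip, private_port) -> public_port
        self.next_port = 40000

    def outbound(self, private_ip, private_port):
        """Rewrite an outgoing connection to the shared public address."""
        key = (private_ip, private_port)
        if key not in self.table:
            self.table[key] = self.next_port
            self.next_port += 1
        return (PUBLIC_IP, self.table[key])

    def inbound(self, public_port):
        """Map a reply arriving at a public port back to the private host."""
        for key, port in self.table.items():
            if port == public_port:
                return key
        return None

nat = Nat()
ip, port = nat.outbound("192.168.1.10", 5555)   # a laptop behind the router
assert (ip, port) == ("203.0.113.5", 40000)
assert nat.inbound(40000) == ("192.168.1.10", 5555)
```

The whole neighborhood of `192.168.x.x` addresses can hide behind one public IPv4 address this way, which is why the same private addresses can be reused in every home on the street.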

Network Management

How does the service agent you finally reached in Poland know the configuration, actual node-to-node data rates, model numbers, and owner of your local area network? How do IT managers combat the condition that 85% of their problems arise from individual users doing something wrong with their computers? How can system monitors know exactly where a line break occurs, or whether a system fault is inside or outside the house? The answer is network management. Data networks are miracles of remote control. Network management facilities find faults, anticipate faults, measure performance, perform security operations, configure devices remotely, and accumulate data for accounting and other purposes. These protocols are also standardized, so any piece of equipment installed in a network will interoperate with centralized control and management stations. Furthermore, given the general nature of communication systems, these control centers can be accessed from anywhere in the world given the ubiquity of the Internet, enabling customer service at any distance.
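Fault discovery by polling can be sketched in a few lines. The devices and thresholds below are simulated stand-ins in the spirit of standardized management protocols such as SNMP; no real management library or live equipment is involved:

```python
# Toy network management poll: walk a table of device statuses and
# flag anything unreachable or noisier than an error-rate threshold.
# Device names and numbers are invented for illustration.
devices = {
    "switch-01": {"reachable": True,  "error_rate": 0.0001},
    "ont-4412":  {"reachable": False, "error_rate": 0.0},    # line break?
    "router-07": {"reachable": True,  "error_rate": 0.02},   # noisy link
}

def poll(devices, max_error_rate=0.001):
    """Return the names of devices that need attention."""
    faults = []
    for name, status in devices.items():
        if not status["reachable"] or status["error_rate"] > max_error_rate:
            faults.append(name)
    return faults

assert poll(devices) == ["ont-4412", "router-07"]
```

A management station runs a loop like this continuously, which is how an agent thousands of miles away already knows your line is down before you finish dialing.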

Network management enables Service Level Agreements (SLAs) between carriers or ISPs and key customers, usually businesses. SLAs specify data rate guarantees, availability guarantees, service response times, error rates, network caching if desired, Virtual Private Network (VPN) access, security, and email capacities and addresses, among other things. They are especially important for provisioning and maintaining data centers that see terabytes per second passing into and out of their huge warehouses of computers.

While not exactly a part of network management per se, IP networks are self-healing. They arose from a 1960s project of the Defense Department to create a network in which the destruction of any one part would not affect the workings of the others, and data routing would be automatically reconfigured. The answer was organizing data into packets that could be interleaved on a network link with other packets going other places, but with routing depending upon end addresses rather than the specification of a path through the network, the way original telephone systems worked. Switches within such a system talk to each other. If one fails, adjacent switches simply route data packets to another adjacent switch. If large sections of switches fail (or are bombed), packets that would have passed through them are also re-routed. There are obvious limitations to this; a switch on the very edge of a network that connects to last mile networks cannot reroute packets from a last mile line. The common anodyne is redundancy at the end offices, battery backup if power fails, and secure physical facilities. Network management systems understand immediately when a switch fails and order service just as quickly.
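The rerouting idea can be sketched with a toy five-node network, our illustration rather than production routing code: remove a switch and the path simply re-forms through the survivors.

```python
# Toy self-healing routing: find any working path from source to
# destination by breadth-first search over the surviving links.
# The five-node topology is invented for illustration.
from collections import deque

def route(links, src, dst):
    """Return a path from src to dst over working links, or None."""
    paths = deque([[src]])
    seen = {src}
    while paths:
        path = paths.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                paths.append(path + [nxt])
    return None   # destination unreachable

links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
assert route(links, "A", "E") == ["A", "B", "D", "E"]

links["B"] = []                 # switch B fails (or is bombed)
assert route(links, "A", "E") == ["A", "C", "D", "E"]
```

The second assertion is the self-healing property in miniature: nothing reconfigured the network by hand; the packets simply took the path that still existed.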

What Value for Other Problems?

One could find imperfections under the hood of this machine, but its product has no peer in human efforts to organize technologies for human benefit and development. Among its current virtues is that it is the most global instrument we have ever created. It knows no borders, and it knows no conflicts among participating nations (which is all of them) relative to the technology itself. Its use is problematic, everywhere. But we are not going to throw it away because some nations allow cyber terrorism, some nations restrict access and content, and broadband is now recognized as a contributor to isolation, loneliness, violence, cruelty, crime, and abuse.

Most of our serious problems are very complex, immune to simple answers or one-shot fixes. Broadband worked through difficulties of this kind by breaking the problem into reasonably discrete and independent components and tackling each separately, with the highest quality minds, working in formal collaborative committees on each segment, who had a vested interest in the outcome. The industry furthermore orchestrated a parallel system for managing itself, one as carefully organized as the system it manages, all operating within the system itself. There is no battery of consultants like McKinsey waltzing in with smooth answers and then leaving with no responsibility for the outcome. There is an old analogy: relative to ham and eggs, the chicken is involved, the pig is committed. Those working within telecommunications are committed.

What if we applied this model to health care? The first act would be to isolate factors of health care that can be treated independent of the others. What comes to mind immediately are electronic medical records, radiology, billing management, nursing standards, emergency room procedures, wellness protocols, patient reluctance to follow advice and doctored-ordered practices, patient monitoring using electronic attachments, ER access and pass through, drug testing protocols, user information access,