Frank Bruni recently had an excellent op-ed in The New York Times. But the title—“How Facebook Warps Our Worlds”—is rather misleading. A more fitting title would have been something like “How We Warp Our Own Digital Worlds.”

Bruni’s article comes on the heels of the recent conservative backlash against reports that Facebook allegedly curated its Trending Topics news feed to excise important conservative-leaning stories. The stories that have emerged in recent weeks have tackled the issue ad nauseam. Bruni, however, approaches it from a slightly different angle, one that merits serious consideration and reflection. He opens the op-ed by candidly laying the blame not on the “unseen puppet masters” curating Facebook’s news feeds, but on all of us. As he notes:

When it comes to elevating one perspective above all others and herding people into culturally and ideologically inflexible tribes, nothing that Facebook does to us comes close to what we do to ourselves.

This is important, especially in the wake of Facebook’s changes to Trending Topics, all of which amount to expanded help desk assistance and “training” for employees. This should come as no surprise, however. After all, what else can Facebook do in the face of conservative upheaval? The algorithms that aggregate information into social media feeds must necessarily be supplemented by human filtering. There has never been a purely “automated process” by which social media feeds are curated. There are no “neutral, objective algorithms.” Rather, the Internet is essentially a digital mirror of ourselves.

For the perfect example of this, look at Microsoft’s recent experiment with Tay, the AI Twitter bot that ended up going on racist, misogynistic rants after “learning” the behavior from the social media ether. Tay mirrored what it heard from Internet denizens. As Bruni notes:

The Internet isn’t rigged to give us right or left, conservative or liberal—at least not until we rig it that way. It’s designed to give us more of the same, whatever that same is: one sustained note from the vast and varied music that it holds, one redundant fragrance from a garden of infinite possibility.

Our ideas, world views, and expectations are more often reinforced by our own echo chambers than challenged. This reinforcement, Bruni argues, “colors our days, or rather bleeds them of color, reducing them to a single hue.” The result is a hodgepodge of “precisely contoured echo chambers of affirmation that turn conviction into zeal, passion into fury, disagreements with the other side into the demonization of it.” The Internet thus mirrors society and, by extension, each of us: our preferences, associations, and world views. In a sense, our online tools are just means by which we transpose our lives into the digital realm, and those tools, wonderful though they may be, are just that: tools that we can use to shape our online bubble.

Google may be great at giving us access to information we previously did not know, or that refutes our long-held beliefs. But there is a chasm of difference between being confronted with evidence suggesting our perspectives are ill-informed, and internalizing that information and readjusting our beliefs accordingly. Simply having access to more information doesn’t necessarily make us better informed. Bruni is correct in noting that it simply takes “a creative or credulous enough Google search” to find “a self-driving ‘truth’ … along with a passel of supposed experts to vouch for it and a clique of fellow disciples.”

Although I agree with Bruni on most counts, I disagree in part with his ultimate conclusion:

It’s about a tribalism that has existed for as long as humankind has and is now rooted in the fertile soil of the Internet, which is coaxing it toward a full and insidious flower.

It’s true that our self-reinforcing ideological bubbles online are fundamentally driven by that age-old tribalism. No doubt. But it’s a tougher sell to argue that the Internet’s soil was once a “fertile” tract of non-ideological tabula rasa. And an inevitable and “insidious” flowering of what, precisely? Human nature asserting itself? Why would we expect the realm of digital bits to be fundamentally different from the world of atoms?

The Internet, at its most basic level, is not simply interconnected networks. Nor is it the fiber-optic cables and wires that transmit packets of binary code. It’s not even those simple “series of tubes.” The Internet is us. It is human minds connected to one another with the aid of the mind-supplementing devices we call computers.

It is important for people to engage with one another, for ideas to disseminate and be discussed in the public forum, and for individuals to reflect soberly on the facts with which they’re presented. Unfortunately, that doesn’t always happen. But it is hardly a phenomenon unique to the Internet, or to modern society more broadly. These issues are, as Bruni correctly identifies, part of “a tribalism that has existed for as long as humankind.” Thus has it always been, and thus shall it always be. The Internet—at least the Internet each of us experiences in our own unique way—is simply a reflection of ourselves. We should not be surprised, then, when cyberspace becomes a reflection of meatspace. Nor should we lament it.