FORMAL SCHOOLING VS YOU: PART III

What do you hope to get out of formal schooling? More specifically, what do you hope to get from higher education? Is it a lucrative career? A well-rounded personality? The ability to converse with almost anyone about almost any topic? Acceptance in certain social circles? The chance to meet people who can help you succeed?

Whatever your goal, it’s entirely possible to meet it without spending years in a stifling classroom environment, and without piling up tens of thousands of dollars in debt.

High school students are told, over and over again, that a good career is impossible without a college degree. But is this true? Most often, the time spent in pursuit of a degree would otherwise be spent in the labor force, in travel, or in business ventures. Whether they succeed or not, these pursuits are learning opportunities. After a few years of them, assuming we’ve put forth reasonable effort, we are likely to have contacts, referrals, and experience producing products or services of real value. Many employers value these assets more highly than a college degree.

Even if we don’t learn much through formal schooling, the degree is a necessary credential, isn’t it? Don’t the best jobs require degrees? This may have been true for many years. When college graduates were rare, degrees may have indicated unusual merit. With millions of new graduates flooding the job market every year, though, the degree means less than it once did. The degree is a much less reliable signal of experience, knowledge, or effort than it once was. Many employers, therefore, are looking for other measures of career fitness. A LinkedIn profile and a five-minute Google search may reveal more about an applicant’s communication ability, work ethic, and commitment to completing tasks than a degree will.

At the very least, some would say, the formal schooling environment offers effective networking. This is questionable, though. College students typically spend most of their time with people of roughly the same age. Most of the student’s acquaintances are studying the same subjects, and are doing the same things with their time. Almost nobody the student knows is active in business or the labor market. His social network is too narrow to benefit him very much. If he wants to cultivate contacts that will help him find employment, the college environment is the wrong place for it.

Whatever you hope to get from higher education, the formal school setting might not be the best place for it. If you look for them, you are likely to find other ways of meeting your goals. And these other ways are likely to cost much less in both time and money.

(To get the most out of informal learning, you need a reliable broadband connection. Talk to us. We can help.)


SCHOOLING VS. EDUCATION: PART II

In an earlier post, we explored the difference between formal schooling and genuine education. Now we will examine a few alternatives to the conventional higher education system.

To begin with, we need to ask ourselves what we hope to gain by education. Is it enlightenment? To this end, are we studying the humanities: philosophy, history, and the arts? Do we see education as cultivation of mind and soul? Do we see it as a means of making us more aware and responsible citizens, with our employment prospects a secondary concern?

Do we see education as an investment? If we do, is there a direct connection between the amount of formal schooling we have, and our value in the labor market? Can this be measured?

Even if we finish knowing no more, and with no more advanced skills, than when we started, is higher education worthwhile for the career credentials it confers?

Zachary Slayback says that in any of these cases, we can reach our goals without the formal school system. Slayback is the author of The End of School: Reclaiming Education from the Classroom. In his book, he argues that formal schooling is not necessary for humanities study, for vocational skill development, or for credentials. There are other ways to reach these goals. We don’t have to pile up massive debt, nor drop out of the labor force for years.

In the first place, most universities do a poor job of teaching the humanities. Higher education has largely been hijacked by cultural Marxists, and the curriculum reflects their ideological hobbyhorses. For many people, online courses are a better option. The Great Courses, for example, offers video and audio instruction in more than 500 subjects, including science, math, philosophy, history, and the arts. ‘Tuition’ ranges from less than $50.00 to about $250.00 per course, depending on complexity. All courses are taught by qualified professors known to be experts in their subjects. Students learn at their own pace. A student can get nearly all of the instruction a university degree provides from The Great Courses, at a small fraction of the expense.

Formal schooling may not be necessary for lucrative job skills, either. Apprenticeship programs can help in developing valuable trades. Praxis (discoverpraxis.com) offers a one-year program, ten months of which are paid work for a startup company. At the end, the student has a portfolio demonstrating drive, focus, and experience in creating real market value. This is worth more to most employers than a degree is.

Some people say formal schooling is necessary for job credentials. Even if the student hasn’t learned much in the university, they say, the paper certificate still opens doors. Many employers, though, consider a degree a much less reliable indicator of work ethic or relevant skill than it was a few decades ago. Ernst & Young, one of the most prestigious accounting firms, no longer requires degrees for new hires. An applicant with a demonstrated record of producing valuable products and services may have a leg up on a degree holder without such a record.

(One of the essential tools for informal learning is a reliable internet connection. Talk to us. We can help.)


SLING TV’S MULTIPLE-STREAM TV SERVICE

Dish Network launched Sling TV, its dedicated streaming video platform, early last year. It was a revolutionary idea for the often-complacent pay TV industry. A satellite system operator was offering a semi-independent internet video streaming service. The customer would not need the customary contract, would not have to sign a long-term commitment, and would not have to schedule an installation. The customer would not need a satellite dish or a dedicated TV set-top box.

Sling TV could be streamed to a wide variety of devices. These include Mac and PC computers, iOS and Android tablets and phones, almost every dedicated video streamer, and several gaming consoles. There are very few electronic devices Sling TV will not support.

In other respects, Sling TV would resemble a conventional cable or satellite TV service. It would carry multiple channels in its core package, including major commercial broadcast stations. The basic channel package would be much smaller than the typical pay TV package, though, and would cost much less.

If there was any major drawback to Sling TV, it was the limit of one stream per household. On Wednesday, April 13, Dish addressed the matter with a new ‘multi-stream’ service (now in beta testing). The customer can now stream to up to three devices at a time. At its launch, the multi-stream selection in the basic package was limited to a few FOX networks: Fox Sports, FX, and National Geographic. Optional premium channels available in multiple streams include A&E, AMC, EPIX, HBO, Scripps, Turner, and Univision.

The channel selection available for multiple streams is likely to expand over time. Dish Network is negotiating with content providers, and expects to offer far more channel options within a few months.

The basic twenty-three channel Sling TV package sells for just $20.00 per month. Several optional ‘Extra’ programming packages are available for $5.00 per month each.

On average, TV bills for Sling TV customers are about half the size of cable bills.

(For any internet video streaming service, you need a good broadband connection. Talk to us. We can help.)


PAY TV’S WEB REVOLUTION

The Federal Government wants to require cable and satellite TV service providers to ‘unlock’ their set-top boxes.  Under an ‘open-source’ standard, if you want to drop one provider’s service, you can use the box you already have for a new provider’s service. You don’t have to buy or lease a new box.

Some pay TV system operators are saying that the box will soon be obsolete, anyway. They say that changing the technical standards for set-top boxes will be futile. It is an unnecessary expense, they say, and may even be counterproductive, since almost all TV content will soon be streamed over the internet.

Last year, Dish Network demonstrated the market power of a dedicated multichannel web streaming platform with Sling TV. The new service has been popular, and now has more than 600,000 subscribers. Sling TV carried just seventeen channels at its launch, but now carries more than sixty, including premium movie channels. Its basic twenty-three channel package sells for just $20.00 per month.

Other pay TV providers, observing Sling TV’s success, have entered the internet video streaming market. Verizon Wireless has pursued the mobile market aggressively with its Go90 platform. Comcast has tested its Xfinity Stream IPTV service in selected markets, aiming to bring broadband users who aren’t pay TV customers into its TV system. Time Warner Cable has tested TWC TV, an internet-only TV bundle, in New York City. For now, only TWC’s broadband customers can get TWC TV, but the company wants to offer it to others soon. Sony’s PlayStation, once strictly a gaming console, now handles streaming video with the Vue upgrade.

More vendors will follow. Within two or three years, the set-top box is likely to become a relic of the past, as almost all multichannel video service providers will be streaming their content over the internet.

(For streaming video, you need the right internet connection. Talk to us. We can help.)


EDUCATION VS SCHOOLING: PART I

Is school getting in the way of your education? Very likely it is, according to Zachary Slayback, the author of The End of School: Reclaiming Education from the Classroom. His point seems to be reinforced by recent news from many universities. We hear about ‘rape culture’ witch hunts which ruin the careers and reputations of innocent men. Crusades against ‘microaggressions’, and student demand for ‘safe spaces’ free of contrary opinion, threaten freedom of speech. Harsh but vaguely defined ‘harassment’ codes are veiled attempts to control speech and publication. The proliferation of transgender studies, women’s studies, ethnic studies, and other such vacuous fields corrupts and infantilizes the curriculum. Faculty are ideologically unbalanced, and humanities instruction has devolved into political and social polemics.

It often seems that most universities are little more than boot camps for aspiring Stalinists.  For anyone seeking genuine scholarship or the free exchange of ideas, the modern university can be a perilous environment. K through 12 education isn’t much better.

Slayback says, though, that the failings of higher education didn’t develop just in the last few decades. The very structure and purpose of the conventional university is at fault.

In the nineteenth century, very few Americans got any formal schooling beyond high school. Most university students were children of the wealthy. Higher education was the primary means of helping them fit into the upper crust of society. The humanities dominated college curricula, because concern for such matters was one of the most important marks of the gentleman.

The twentieth century saw the rapid rise of the middle class, and with it a skyrocketing market for higher education. Ambitious climbers, and parents concerned for the social and career success of their children, noted that the most successful men in America usually had college degrees. From this, millions assumed that the degree was necessary for success. Generations of politicians, anticipating political profit in treating higher education as an entitlement, were happy to promote the idea.

After World War II, with the G.I. Bill providing public tuition funding for military veterans, millions of new students flooded America’s colleges and universities. Demand for higher education exploded. That it must be indispensable for success was well on its way to becoming settled orthodoxy.

Zachary Slayback says this core assumption is wrong. As scientists often say, correlation is not causation. It’s true that the successful usually had college degrees. Slayback says, though, that family wealth usually led to higher education, not the other way around.

Slayback says that we need to rethink our assumptions about the need for formal schooling, and about how we learn.

In a future post, we will explore several alternatives to conventional colleges and universities.

(For the school of the future, you need the right information technology. You need the right internet connection. Talk to us. We can help.)


IS TECHNOLOGY ACCELERATING?

It may seem to you that the pace of technological development is moving faster than your ability to keep up. Are you just imagining this?

According to some of the world’s leading experts in technology, it’s not all in your head. Our tools and industrial processes are changing at an ever-faster rate. Ten years ago, you didn’t own a smartphone. Video services on mobile devices were unheard of. Thirty years ago, very few people owned personal computers, and digital information was nearly the exclusive possession of government and business elites. Today, you can carry virtually the entire store of the world’s knowledge in your hand.

According to Ray Kurzweil, author of The Singularity Is Near, the pace of technical innovation really is gathering speed. You may have heard of Moore’s Law. It’s named after Gordon Moore, who observed in 1965 that the number of transistors on an integrated circuit was doubling at a regular interval, and who later refined his prediction to a doubling roughly every two years. So far, his prediction has proven remarkably accurate.

Kurzweil says that Moore’s Law applies to more than computer circuits. The same principle, he says, applies to technology development in general. For example, DNA sequence data has increased about ten million times since 1982, bandwidth in the internet backbone has grown by about 10 billion times since 1985, and the performance-to-price ratio for wireless devices has increased by nearly a million times since 1990. There are many more examples. Across a wide range of technologies, capability has grown by factors of millions, even billions, in just a few decades, and at dramatically lower prices.
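To see why Kurzweil groups these examples with Moore’s Law, it helps to convert each total growth factor into an implied doubling time. The short Python sketch below does the arithmetic; the 2016 end year is an assumption (the post gives only start years), so the figures are illustrative only.

import math

# Back-of-the-envelope doubling times implied by Kurzweil's examples.
# Assumption: growth is measured through 2016; the post gives only start years.
def doubling_time(growth_factor: float, years: float) -> float:
    """Years per doubling implied by exponential growth of growth_factor over years."""
    return years * math.log(2) / math.log(growth_factor)

examples = [
    ("DNA sequence data",           1e7,  2016 - 1982),
    ("Internet backbone bandwidth", 1e10, 2016 - 1985),
    ("Wireless price-performance",  1e6,  2016 - 1990),
]

for name, factor, years in examples:
    print(f"{name}: ~{doubling_time(factor, years):.1f} years per doubling")

Each example works out to a doubling roughly every one to one and a half years, which is why Kurzweil treats them as Moore’s-Law-style exponential curves.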

Kurzweil calls technical development an evolutionary process. As in biology, ‘natural selection’ means that advantageous development is passed on to our technological ‘offspring’. Not having to start from zero, we build on what’s been done. Our tools, like living organisms, become increasingly complex and increasingly capable. As Kurzweil put it: “Evolution applies positive feedback. The more capable methods resulting from one stage of evolutionary progress are used to create the next stage.”

Technology follows what Kurzweil calls the law of accelerating returns. Each generation stands on the achievements of its forebears. Each generation adds its own improvements, enabling even greater achievement in the next.

(To get the most out of technology, you need the right information tools. Talk to us. We can help.)


SOCIAL MEDIA AND PRIVACY

If you spend much time online, your privacy is unsafe unless you take steps to protect it. What may be even more dismaying is that the rules governing online privacy are inconsistent. They inhibit only a few of the worst potential violators, leaving others free to vacuum up as much of your personal data as their technologies allow.

Last week, the Federal Communications Commission unwittingly underscored this inconsistency. Tom Wheeler, the FCC Chairman, announced a proposal for imposing strict new privacy rules on internet service providers.  From the consumer’s point of view, the proposal was a huge step forward, as ISPs would have to protect personal information, report breaches, and obtain consumer consent for personal data collection. Consumers would have to ‘opt in’ to allow collection of personal information. The new regulations would make it more difficult to use consumer data for targeted advertising.

Unfortunately, the new rules would exempt Facebook, Twitter, Google, and other websites and social media platforms. The American Civil Liberties Union expressed disappointment with the proposed new rules, and other consumer groups gave them only qualified endorsement. Some ISPs panned the proposal. AT&T, for example, called it discriminatory. The telecom giant objected that broadband providers would be held to stricter standards than other online companies.

Since the FCC won’t do much to protect you, you have to protect yourself when using social media. Consider using an ad blocker. Carefully review the privacy policy of any social website you visit.

You need to be vigilant to guard your privacy on any social medium. Some websites change privacy settings frequently, without notifying users. Facebook is especially notorious for this.

If you find that your privacy settings have been changed without your consent, change them back. Then send a complaint to the site administrators. This will not guarantee that the site’s policies will change, but it may help. If enough users complain, administrators may finally pay attention.

Above all else, remain alert. The best safeguard for your privacy is your own common sense.

(For the internet service that meets your needs, talk to us.)


THE THIRD GREAT LEAP

Are we on the verge of the third great technological leap in human history? Some economists and inventors say we are.

The First Great Leap, about 10,000 years ago, was the development of agriculture. The hunter-gatherer societies that had existed until then were small, unstable, and at the mercy of the elements. People had to move frequently to follow game.

With agriculture, the human race developed a degree of control over nature. In planting and harvesting crops, we could build up food surpluses. The surpluses became a foundation for credit and trade. In domesticating animals, we had predictable supplies of meat, hides, milk, eggs, and wool. With predictable food supplies, permanent dwellings became practical, and man built the first cities. As trade accelerated, we built up further surpluses, which encouraged greater division of labor and some leisure time. This fostered sophisticated religion, philosophy, entertainment, scientific inquiry, and the arts.

The Second Great Leap, the Industrial Revolution, occurred about 200 years ago. Man’s output would no longer be limited to the product of his own muscles or the muscles of his livestock. With the invention of reliable steam engines, then electrical power, man could multiply his productivity many times beyond what was possible with muscle power alone.

The Industrial Revolution multiplied wealth for the masses. An ordinary citizen in America or Western Europe now enjoys comfort, leisure, and mobility that were unavailable even to royalty two centuries ago.

The Third Great Leap is the information revolution. We are on the cusp of it now. Computer technology has come a long way in the last forty years, but still is primitive compared to what it soon will be. The internet, scarcely dreamed of a generation ago, is still in its infancy.

The third leap is the use of information for more than training and education. We are about to use encoded information routinely to manipulate physical reality. With a VR headset and a control console, someone in Spain could operate an earth mover in Sweden. A surgeon could operate on a patient remotely, with robots cutting more precisely than the human hand. A factory manager in Phoenix could run production in Tucson with no staff on site, monitoring and addressing problems in real time.

Some of the most important emerging technologies include virtual reality, 3D printing, gene editing, and the ‘internet of things’. Sensors will be nearly everywhere. If we want, we can have nearly constant feedback about nearly everything in our environment.

Some experts believe the Third Great Leap will multiply average productivity more than fifty times within a few decades. If this happens, nearly all of us will be much richer. We could easily pay off the national debt. We would have cheap and abundant energy. We could solve problems that seem intractable now.

We cannot know now exactly how the Third Great Leap will affect us. We can make only the vaguest of guesses. It will, no doubt, bring us many new problems as well as opportunities. At any rate, we can be sure that our lives will be very different.

(To get the HughesNet data service that’s right for you, talk to us.)


IMMORTAL INFORMATION

Can information be immortal?  Scientists at the University of Southampton (UK) say it can be.

For most of us, this claim would sound preposterous. We know that any medium we write or draw on, or encode our data on, will be destroyed over time. Heat, humidity, and chemical breakdown will rot paper. Clay tablets crumble. Stone breaks, and is eroded by the elements. Ferrous metals will rust. Celluloid melts under high heat, and decomposes as the chemicals that form it break down. The substances that form our DVDs and hard drives may last longer, but they too are subject to the relentless process of decay.

Finding a truly permanent data storage medium has been one of the great quests of the Information Age. A few weeks ago, scientists at the University of Southampton’s Optoelectronics Research Center demonstrated just such a medium. It is by far the most permanent and versatile data storage and retrieval method yet.

The new storage medium is a nanostructured glass disc made of fused quartz. A femtosecond laser writes the data onto the disc. The Southampton ORC team calls its new data-writing method ‘five-dimensional’, based on the three position dimensions, plus orientation and size.

Each disc, slightly larger than a U.S. quarter, can hold 360 terabytes of data. It will last for 13.8 billion years, approximately the age of the universe, at 374 degrees Fahrenheit. At room temperature, it will be nearly immortal. The molecular structure of the disc will remain stable at up to 1832 degrees F.

Reading the disc requires shining a light through it, then measuring the resulting data with an optical microscope and a polarizing filter. Experiments in 2013 proved the feasibility of this method with a 300 kilobyte file. The method has been refined since then, and now can accommodate files more than a million times larger.

It is possible now to record the entire history of civilization, without concern about limits in storage capacity, or decay of the storage medium. The Southampton ORC gave UNESCO an immortal ‘5D’ disc containing the Universal Declaration of Human Rights. The Christian Bible (KJV), the Magna Carta Libertatum, Newton’s Opticks, and other important historical documents have also been stored on ‘5D’ discs.

Does this mean the story of your life will be immortal? It might. Be careful how you live. Your cat videos, your social media posts, your financial records, and your behavior at bars may be preserved for future generations to puzzle over.

It isn’t just glory that could live forever. Embarrassment could, too.

(Do you have enough bandwidth for your data needs? Talk to us. We can find the plan that works best for you.)


MEMORY BY GOOGLE

Have you ever forgotten a business appointment? Have you ever forgotten your spouse’s birthday? Have you ever forgotten your most important point while briefing your boss about a critical project?

Memory often fails us when we need it most. Within a few years, though, you might not need it. Machines will remember what you need to know.

Last month, IBM patented an algorithm it calls an “automatic Google for the mind”. It could track your behavior and speech, analyze your intentions, and, discerning when you seem to have lost your way, offer suggestions to prod your memory. Dr. James Kozlowski, a computational neuroscientist for IBM Research, is the lead researcher for the automated memory project. Kozlowski says he helped develop his company’s new ‘cognitive digital assistant’ for people with severe memory impairment, but it could help all of us with research, brainstorming, recovering lapsed memories, and forming creative connections.

IBM’s new cognitive tool tackles the most common cause of memory failure: absence of context. Memory, for most of us, is a web of connections. Remembering a single aspect of an experience, we can call up others. To remember is to find the missing piece in a puzzle. If you can’t find the first clue, you can’t find the second, and you don’t have a mental map for the information you need.

Dr. Kozlowski says IBM has found the solution for our memory failures. His cognitive assistant models our behaviors and memories. It listens to our conversations, studies our actions, and draws conclusions about our intentions from our behavior and speech patterns. From this data, it can discern when we have trouble with recall. It will then guess what we want to know, suggesting names and biographical data within milliseconds. By studying our individual quirks, it will learn what behavior is normal for us, and when we need help.

Synced with your phone, the automated cognitive assistant will search its database of phone numbers to find out who’s calling you. Before you answer, the assistant will display the caller’s name, highlights of your recent conversations, and important events in the caller’s life. At a business meeting, your digital assistant will, on hearing certain words, recall related points mentioned in past meetings, along with your research on the subject. It will display them on your mobile device, or ‘speak’ them into an earpiece.
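The article describes this flow only in broad strokes, so the snippet below is a purely hypothetical Python sketch of the caller-lookup step, not IBM’s design; every name, data structure, and phone number is invented for illustration.

from dataclasses import dataclass, field

# Hypothetical sketch of the caller-lookup step described above.
# Nothing here reflects IBM's actual implementation; the data is invented.
@dataclass
class Contact:
    name: str
    recent_topics: list[str] = field(default_factory=list)  # highlights of recent conversations
    life_events: list[str] = field(default_factory=list)    # important events in the caller's life

# The assistant's 'memory': a phone-number index built from past interactions.
contacts: dict[str, Contact] = {
    "+1-555-0100": Contact(
        "Alex Rivera",
        recent_topics=["last week's budget review"],
        life_events=["daughter's graduation"],
    ),
}

def on_incoming_call(number: str) -> str:
    """Return a brief prompt to display (or speak) before the user answers."""
    contact = contacts.get(number)
    if contact is None:
        return f"Unknown caller: {number}"
    context = "; ".join(contact.recent_topics + contact.life_events)
    return f"{contact.name} is calling. Context: {context}"

print(on_incoming_call("+1-555-0100"))

A real assistant would, of course, build and update this index automatically from calls, messages, and meetings rather than from hand-entered records; the sketch only shows how simple the lookup becomes once that context exists.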

It’s likely to be several years before IBM’s automated cognitive assistant is in common use. A few bugs stand in the way of commercialization, but it’s still an impressive achievement.