Behavior Management vs. Technology Management

A question came up on one of my mailing lists about how some school districts manage students’ personal devices connecting to district networks.  The inquirer’s administration wanted to shut down all guest wifi access as a way to curb social media bullying and other antics, on the assumption that forcing students to fall back on their own plan minutes would be a sufficient deterrent.

This seems like an overreaction, and one that relies on external forces to manage an internal problem.  This is a behavior management issue; the technology is merely a tool being misused.  While reducing or eliminating access to those tools would address the immediate issue, how does that impact the rest of the population?

If a handful of students are misbehaving and misusing the technology made available to all students, and that technology is taken away from all students, what are the consequences?  If guest-access wifi is removed, not only does that impact the entire student body, but also any guest speakers, parents, business contacts, and even visitors from neighboring schools.

Drunk driving could easily be resolved by banning all vehicles and alcohol.  Is that the right solution, though?

If there are flies buzzing around from something that’s spoiled in the kitchen, do we remove the entire kitchen?  Or do we hunt around for the rotten meat that fell behind the stove, which is where the flies have congregated?  Yes, it’s more work to trace the source and clean up the meat, but it doesn’t go overboard by demolishing the whole kitchen.

“But I’ll Never Remember a Complicated Password!”

An article about password security, aimed at the average person or IT people who work with average people and need another way to explain it.

Social engineering seems to be the easiest way to grab a user’s password, and despite suspicion on the part of IT staff, the average user gets roped in pretty quickly. Phishing attempts are getting bolder and more sophisticated, and objectively speaking, I have to applaud some of the efforts because they’re pretty good.

Not much can be done to secure an account if the account holder willingly gives it up. But mitigating damage from brute-force attacks and even “shoulder surfing” can be much easier.

Use a phrase or sentence.

Mix in a few capitalizations, maybe even skip a character. Try it out here:

https://password.kaspersky.com/

In a nutshell, the more characters there are to try to figure out, the harder it gets mathematically to solve. For illustration’s sake, with a single-digit number, you have a 1 in 10 chance of getting it right. For a human, it’s pretty simple. For a computer, it’s instantaneous.

Add another digit, and the odds increase to 1 in 100.

For a single letter of the English alphabet, the chance of “cracking the code” is 1 in 26. Add another letter (where repetition is allowed) and the odds are 1 in 676 (26 x 26).

Still pretty easy for a computer.
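The arithmetic above can be sketched in a few lines of Python (the function name is mine, just for illustration):

```python
# The brute-force search space is (alphabet size) ** (length):
# every position can hold any symbol from the alphabet.

def search_space(alphabet_size: int, length: int) -> int:
    """Number of possible passwords of a given length over a given alphabet."""
    return alphabet_size ** length

print(search_space(10, 1))  # one digit: 10 possibilities
print(search_space(10, 2))  # two digits: 100
print(search_space(26, 1))  # one lowercase letter: 26
print(search_space(26, 2))  # two letters, repetition allowed: 676
```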

While words of multiple characters and numbers are more complex, computers are able to use dictionaries and heuristics to figure out a password (“heuristic” being essentially the practice of starting with the most likely guesses and working outward). Add in behavioral analysis (especially through mining big data acquired via social media), and a computer can easily figure out the most popular passwords for a school teacher at this time of year:

  • summer
  • summer1
  • Summer123 (this of course being the most complex…)

Seriously. We went through a lot several years ago to get our teachers to stop doing this.
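To show the “most likely first” idea in action, here’s a toy sketch of building a heuristic wordlist. The base words and suffixes are my own assumptions for illustration; real cracking tools use far larger lists and far more mutation rules:

```python
from itertools import product

# Start from the most likely base words, then layer on the most
# common mutations (appended digits, capitalized first letter).
base_words = ["summer", "password", "welcome"]
suffixes = ["", "1", "123", "2017"]

candidates = []
for word, suffix in product(base_words, suffixes):
    candidates.append(word + suffix)               # all lowercase
    candidates.append(word.capitalize() + suffix)  # first letter capitalized

print(candidates[:6])
# the first few candidates: summer, Summer, summer1, Summer1, ...
```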

Conversely, a password like S&4u_sO9%8sS8^2HhYvoO is nearly impossible to crack…but also nearly impossible to remember. Plus, not every system out in the wild can handle all of these requirements: some systems can’t handle special characters, others can’t handle certain special characters, and I still see systems that demand a maximum of 8 characters (in 2017!!!).

A comic to explain more technically

The balance between a password complex enough to make it difficult for computers to crack but simple enough for a human to remember is to use a sentence or phrase. Artificial intelligence is not yet at the point where a system can guess the meaning and impact of certain terms.

Try “My Cat is 17 Years Old” (author’s note: my cat is not 17 years old…I don’t even have a cat……… or do I?).

This isn’t to say that a computer won’t find out your password within seconds or minutes. It could be very, very lucky. But the chance of that happening is incredibly slim. Much slimmer than if your password were cat17. But not as slim as King Illegal Forest to Pig Wild Kill In It A Is.
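For a rough sense of scale, here’s a back-of-the-envelope sketch. The guess rate of ten billion per second is an assumed figure for illustration, not a measured benchmark, and the alphabet sizes are rough estimates:

```python
# Rough years needed to exhaust a search space at an assumed guessing rate.
def brute_force_years(alphabet_size: int, length: int,
                      guesses_per_second: float = 1e10) -> float:
    total_guesses = alphabet_size ** length
    seconds = total_guesses / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# "cat17": 5 characters drawn from lowercase letters + digits (36 symbols).
print(brute_force_years(36, 5))   # a tiny fraction of a second

# A 21-character passphrase over letters, digits, and spaces (~63 symbols).
print(brute_force_years(63, 21))  # astronomically long
```

The takeaway matches the point above: length does far more for you than a short string of exotic characters.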

An Analogy from an Unlikely Source

Following up from my previous post about using analogies to describe technical terms to non-technical people, I heard what might be the best analogous circumstance from somewhere I didn’t expect.

A cable shopping channel.

On QVC this morning, as we were flipping channels looking for programs for the kids, I landed upon a segment selling HP laptops. And despite one of the hosts’ scripted, insincere sincerity and exuberance describing technology to viewers in the guise of a “conversation” with his cohost, he (or his writers) used the analogy of a restaurant kitchen to describe the internal workings of a computer.

The multi-core CPU was the chefs in the kitchen. It split duties among them all in order to manage specific tasks simultaneously, and the more chefs one has in the kitchen, the more duties they can govern. Head chef, sous chef, commis chef, etc. Each can be likened to one of the cores of a processor, handling its own assigned task both independently and in conjunction with the other cores to complete the overall task (the meal).

The RAM is the counter or work surface. The greater the amount of RAM in the computer, the larger the work surface in the kitchen, which means more ingredients and dishes can be placed there at any given time before they have to be moved out to their next destination. If a counter is only 1′ by 3′, there’s a limit to the number of plates that can be placed there before another process (the servers) has to move them out, which means delays if the servers are busy. Conversely, if the counter is 3′ by 16′, then not only can more plates be placed on the counter, freeing the cooks to continue working, but servers don’t have to wait for dishes and more dishes can be placed for expediting.

The hard drive space is the size of the walk-in and dry-goods storage, which is pretty self-explanatory.

Admittedly, despite my disdain for the cable shopping channels simply because of the mind-numbing superficial presentation of the shows and hosts, this particular analogy was actually brilliant. I may not be in the business of consulting with my colleagues for home computer purchases, but the reality is that I get asked quite frequently. And because of my innate expertise, I’m going to help them out, because why wouldn’t I?

I’m going to use this restaurant analogy now.

Specializing or Generalizing?

We humans have a limited amount of resources available for pretty much anything. Whether it’s money, time, energy, or general give-a-darn, we don’t have infinite amounts of it. There is no limitless supply of money/time/energy/give-a-hoot waiting in the wings, available anytime we need it.

So too does that apply to determining what to do with one’s IT career.

In the distant past, when technology was still new, and support was in its infancy, we IT pros had to generalize to survive. We were expected to know everything about everything — networks, operating systems, software, cabling, databases, security, programming. You name it, we were expected to know it. And what was the end result?

Mediocrity.

The problem is that we have limited time and ability to learn all there is to know about everything within the time constraints we’re given. In the analogy of the ice cube tray, our resources are illustrated by a pitcher of water (not a faucet). From that pitcher, we can choose two paths:

  1. Fill all the cube receptacles all at once and see what level they reach, or
  2. Fill each receptacle sequentially, knowing that there will be empty cells.

With #1, we ensure that every cell or receptacle or cup has at least some water. With #2, we ensure that we can fill as many as possible to the top, but we know that others at the end will remain empty.

This is the illustration of generalization (#1) vs. specialization (#2). So which is “better?”

That all depends on your own set of values. Personally, as a student of life, I love to know a bit about everything so that if a topic ever comes up in conversation, I can participate to some degree. I love science. I can discuss some literature, art, and music. I play a few instruments. I know quite a bit of history. I can teach someone the basic fundamentals of calculus using AD&D terminology.

But I’m far from an expert in any of those fields. This is because I filled my ice cube tray from that pitcher all at once, to try to get some water in at least every cell.

I know some brilliant musicians who are terrible at cooking. I know phenomenal artists who are awful at math. And of course, I know plenty of savant-like IT people in various specialties who are clearly deficient in other areas of life.

These are the people who’ve concentrated on filling one cell in the ice cube tray from the pitcher at a time, to make sure that they are using all the available resources to fill that one cell to the top before moving on to the next.

Now, obviously, there are pros and cons to both approaches. Neither is better than the other, because again, it all depends on what you value more. As an IT professional, generalization does make it a bit more difficult to land that prestigious job or even do more focused job searching, while the specialist knows exactly what he or she is targeting and what it takes to get there…but the opportunities are fewer and farther between.

The generalist can probably find more opportunities to submit applications, but yield fewer interviews or offers. The specialist may find very few application opportunities, but for the ones they do find, interviews almost seem to happen right away. The generalist might have greater flexibility in moving from field to field, while the specialist is stuck in a handful of areas of expertise.

There are benefits and drawbacks to both approaches. Which you select depends on what you value more, and your tolerance for risk.

Using Analogies

Having been involved in technology pretty much my entire life, tech concepts come pretty easily to me. But I also understand that it’s not the case for everyone, regardless of age or exposure. For example, those of us from my generation may often wonder why younger folks seem to have difficulty with technology when they’ve grown up with more exposure than we did.

That can be countered simply by pointing out that we of my generation (born in the ’70s) have grown up exposed to cars all of our lives…but we are not necessarily qualified to repair them.

Exposure doesn’t equate to expertise.

This is one of the examples of analogies that I like to use to bridge gaps in understanding when it comes to technology. Not everyone I encounter in my role in IT is going to be up to speed on all aspects of tech (that’s why I exist). But framing technology into terms that mirror ubiquitous concepts, like cars or the medical profession, seems to help open up the listener’s mind to the possibilities in the conversation.

By framing technology into familiar concepts and terms, audiences can start to see that technology is not as intimidating or daunting as first feared; that, like the other familiar concept, it can be learned and understood one piece at a time. Similarly, even complex technological concepts can be broken down and more easily digested once the fundamentals are understood through the use of analogies.

Another parallel conversation regarding information technology can be the profession itself. Young professionals and students exploring careers in IT may not be aware of the range of choices and specialties that fall within the broad umbrella of IT. Thus, it can be tempting to ask “what’s the best way to break into IT?” or “what degrees/certifications do I need in order to become an IT professional?”

Unfortunately, a question like that is akin to asking “how do I become a medical professional?” without specifying a professional path in medicine or even a specialization. In medicine, one must choose a professional path (doctor? nurse? research scientist? EMT?) and specialization (trauma surgery? cardiology? rheumatology? orthodontics? optometry?), and that in turn will determine the education, certification, and residency requirements needed to fulfill those goals. IT is no different — choose a path, choose a specialty, then determine the requirements to achieve that goal.

Despite its commonplace existence, IT is still in a way considered a nascent industry. Thus, understanding the underpinnings of “the job” is not nearly as commonplace as presumed. Using analogies to equate the aspects of information technology fosters that understanding and learning about our profession by those who are not as familiar with it, without taking on a haughty and arrogant attitude that seems unfortunately too commonplace within our peer group.
