UX Failures: A Bottom Ten List

By Robert R Glaser, UX Architect

I make mistakes. Most of my greatest successes, in fact, came from lessons learned through failure. It would almost seem that any success that didn’t come from failure came from luck. Failure taught me through experience, in very concrete terms, what happens when I fail.

We are in a deluge of “Top 10 lists”. This list worked out to match that ten by coincidence, not by intent. I tend to loathe the common, facile top ten list as clickbait, or worse, as a set of ostensible and rudimentary rules that should have been learned in the “101” level of any area of study, much less expertise. Sometimes ten is too many, and far too often it’s not enough. The thing that irks me about this trend is the implication that expertise can come from reading these lists (this list below included). It is a recipe approach to problem-solving, in this case in UX. The recipe approach provides answers to specific problems, but never with enough background to really understand whether the solution you’re about to apply will actually solve the problem, or will do so without causing other, cascading problems.

So I thought I’d make a list of common problems caused by the “instant expertise” of these oversimplifications, and maybe provide a way of avoiding them by offering direction rather than solutions.

I intentionally didn’t put illustrations here, since this article is meant to be thoughtful and to let you think of your own examples. Often when pictures are provided, people tend to draw conclusions too quickly and without a thorough understanding of the ideas.

1. Anything that can be distilled to a top ten list is cursory at best.

The internet is bursting with life hacks, memes, and other suggestions so amazing you wonder that anyone even thought of them. At the same time, so many of these suggestions for easier, faster, cheaper, better solutions turn out to be none of those things (or maybe just one).

It gets worse when you focus on a particular field of work like UX (though it is by no means limited to it). Here we are barraged with rules of thumb that are overgeneralized, or too specific, or highly limited, or so obvious that if you have to be told, you are in the wrong area of work or study.

Would you go to a doctor who refreshed their knowledge of current medical practice by reading a top ten list of missed stroke symptoms? Yet these lists are commonly recommended as a formula for successful articles and blogs. I rarely come across a list where, after reading only one or two items, I don’t find myself thinking “except when…” followed by countless examples. The problem is not necessarily the overly simple statements themselves, but the intractable completeness claimed for them, often accomplished with the prefixes “Always…” or “Never…” or similar authoritarian language. By the way, this list is no exception, which is why I worded the first item’s title as I did. The items here are meant to create awareness and begin dialogues, not to provide exact rules for advanced UX design.

2. Check your facts.

I see so many lists with factual-sounding statements (often wrapped in technical jargon) that frequently don’t even make sense. I would expect anyone reading this blog to be just as skeptical, and you should be.

I recently read an article about color use that popped up in my email from one of those sites that regularly publish articles about different aspects of design. As I began reading, I noticed this string of statements.

Firstly, we need a shared language of color terms and definitions. This handy list is from our quick guide to choosing a color palette.

The vocabulary of color

Hue: what color something is, like blue or red

Chroma: how pure a color is; the lack of white, black or gray added to it

Saturation: the strength or weakness of a color

Value: how light or dark a color is

Tone: created by adding gray to a pure hue

Shade: created by adding black to a pure hue

Tint: created by adding white to a hue

What bothered me was the arbitrary mix of terminology from subtractive color models (typically the RYB or CMYK associated with physical pigments and dyes) and additive color models (typically the RGB model associated with light and digital displays), which shouldn’t be mixed, particularly when it’s preceded by the line:

Firstly, we need a shared language of color terms and definitions.

The article itself referenced another article that used a standard I’d never heard of as a painter, nor read anywhere: “Painters Color Mixing Terminology”, implying this was a standard taxonomy. I have heard many of the terms, but never in such a structured and almost arbitrary arrangement. One can simply use Photoshop to see that these seemingly defined terms fall apart the moment one assumes they mean something concrete with clear, consistent results. Worse, the article never addresses subtractive versus additive color as they function for the everyday designer, even though design work may appear both in print and on the web, where each model applies, and understanding that difference is essential to the practical and technical aspects of implementing any quality aesthetic design.
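To see how slippery those definitions are in practice, here is a minimal sketch (my own illustration, not from either article) of tint, shade, and tone as naive additive RGB mixing. The mixing ratio and the choice of gray are assumptions; the quoted definitions supply neither.

    # A minimal sketch (my illustration, not from the article) of how
    # "tint", "shade", and "tone" translate into naive additive RGB mixing.
    # The mixing ratio `amount` (0.0-1.0) and the choice of gray are
    # assumptions; the quoted definitions supply neither.

    def mix(color, target, amount):
        """Linearly interpolate each RGB channel toward a target color."""
        return tuple(round(c + (t - c) * amount) for c, t in zip(color, target))

    def tint(color, amount):                        # "adding white to a hue"
        return mix(color, (255, 255, 255), amount)

    def shade(color, amount):                       # "adding black to a pure hue"
        return mix(color, (0, 0, 0), amount)

    def tone(color, amount, gray=(128, 128, 128)):  # "adding gray to a pure hue"
        return mix(color, gray, amount)

    pure_red = (255, 0, 0)
    print(tint(pure_red, 0.5))   # (255, 128, 128) -- a pink
    print(shade(pure_red, 0.5))  # (128, 0, 0)     -- a dark red
    print(tone(pure_red, 0.5))   # (192, 64, 64)   -- a muted red

Change the gray or the ratio and “tone” yields an entirely different color, which is exactly the inconsistency you find when you test these terms against a real tool.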

The referring article also included this quote as important information:

What colors mean

Red: energy, power, passion

Orange: joy, enthusiasm, creativity

Yellow: happiness, intellect, energy

Green: ambition, growth, freshness, safety

Blue: tranquility, confidence, intelligence

Purple: luxury, ambition, creativity

Black: power, elegance, mystery

White: cleanliness, purity, perfection

Any good UX designer would throw these out the window. What any color means changes wildly depending on the context, the contemporary zeitgeist, and the culture of the demographic it is aimed at. For example, in the US red often indicates warning or danger, whereas in Japan it indicates happiness. And even these associations depend, to greater or lesser degrees, on context. Not every international-looking design is as internationally accepted as the frequency of its use might suggest.

These examples come from a single article about color; others are myriad.

While I see and peruse a lot of UX sites, there are very few I read with regularity. These are the ones I trust. I’m not naming them, because you as a reader (and I assume some of you are UX designers) should be selecting sites with some real critical thinking. I have a library of real books I can peruse, as well as a wealth of online peer-reviewed literature. But I always let a healthy amount of skepticism be my guide. Even Wikipedia can be good for some foundational material, as long as you check any sources that seem too good to be true, to make sure you’re not unconsciously p-hacking yourself. Always try to prove yourself wrong, not right. Proving yourself right is far easier, and ego makes misleading results far harder to refute.

Steve Allen, the late comic, composer, and writer, wrote in his book “Dumbth!” about the lack of critical thinking, particularly the habit of taking experts’ assessments, reviews, and endorsements as fact. This lack of discernment is a common weak link in the chain of thought, where opinion gets rationalized or biased into fact.

I welcome different perspectives and I’m sure that the statements I’ve made here are equally subject to them.

3. UX Design is an empirical science, psychology, and art.

I am practically pummeled with job offers for UX designers in which the science of UX is rarely mentioned beyond a vague or oversimplified reference to user testing. Sometimes they add more of the traditional jargon, like wireframes (so often misused or misrepresented) and prototyping. What is rarely mentioned, and is a red flag, is any indication that these scientific aspects of UX design will be supported (financially or with personnel). It’s as if all these tasks could easily and just as effectively be replaced by heuristics (often mine alone) and “common sense”, which in UX research is often the foundation of the false-consensus effect. This is a poor cost-saving and time-saving measure. I can use my experience and good heuristic evaluation practices as a triage for user testing, but they are a poor substitute.

If there is a lot of language in there regarding the importance/significance of creating visual design assets, design guides, or other more visual aspects of design, then I’ll usually pass or ask if they are really looking for a visual designer with some UX background – which is basically a visual designer.

Conversely, if they require a lot of coding knowledge, front-end development, CSS, JavaScript, etc., then I’ll usually pass, or ask if they are really looking for a developer.

The reason isn’t that I can’t do visual design (it was my original study in college) or that I can’t code (I actually went back to college to learn programming); it’s that I want to design UX architecture. I don’t want to be a visual designer or a software engineer. I like having enough knowledge and experience in these areas to address UX in a way I know can be implemented, and if someone asks “how can that be implemented?”, I can explain in detail, with specific technical references. Both are important skills, but when I’m writing code or designing icon families, I’m not doing UX design.

4. UX Design should use the scientific method.

If you want to do an effective job of testing an interaction, test how it fails. Testing for success is easier, since it requires a far less rigorous approach, but it also makes it easier to cheat and bias the results, even without malicious intent. I’m not referring to grand p-hacking news stories (although they’re relevant) but to the subtler kinds of formative testing of the simplest of tasks.
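As a toy illustration (mine, with invented counts) of what failure-focused, quantitative testing can look like, here is a sketch comparing task-failure rates between two design variants using Fisher’s exact test:

    # A toy sketch (invented counts) of testing how a design fails rather
    # than how it succeeds: compare task-failure counts for two design
    # variants with Fisher's exact test.
    from scipy.stats import fisher_exact

    failures_a, successes_a = 9, 41   # variant A: 50 test sessions
    failures_b, successes_b = 2, 48   # variant B: 50 test sessions

    # 2x2 contingency table: rows are variants, columns are fail/succeed.
    _, p_value = fisher_exact([[failures_a, successes_a],
                               [failures_b, successes_b]])

    print(f"Variant A failure rate: {failures_a / 50:.0%}")  # 18%
    print(f"Variant B failure rate: {failures_b / 50:.0%}")  # 4%
    print(f"p-value: {p_value:.3f}")  # small p suggests a real difference

The specific test matters less than the discipline: define what counts as failure before anyone sits down, then try to disprove your design rather than confirm it.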

I should also point out that the importance of appearance, while highly influenced by emotion and experience and therefore quite variable, is nonetheless measurable and should always be taken into account. This is where comparing quantitative and qualitative results can reveal some surprising information.

5. Visual design is far less important unless your product is a commodity.

Generally, a product either has a unique differentiator (something literally no one else has), or a differentiator that isn’t unique but is significantly better (faster, more accurate, more complete, simpler, cheaper) than a competitor’s. In both cases, visual design (if it’s not an inherent function of the product) is one of the last things to be considered in its design. This is because a differentiator needs to be both introduced and made to appear intuitive (even if it really isn’t). That is done through UX architecture: user research, interaction design, formative iterative testing, and whatever other areas of ideation are available given the resources (primarily people) at hand.

Visual design shouldn’t even be addressed until then (again, excluding things with specific visual requirements, like standardized colors for stop/go, start/stop, or warning/failure; even these, though, should remain as words for as long as possible).

The reason for this is that visual design biases early iterative testing into creating artificial mental models that become improperly reinforced.

But if the product is at the commodity level, such as a website for a store of any type, the approach changes. Here visual design can begin far earlier, concurrently with the interaction design. In the early stages, branding concepts such as palettes, layout schemas, font and asset standards, and designs can be explored in a componentized and scalable manner, so that the visual elements integrate harmoniously with the interactive elements when the two are brought together. This approach works more collaboratively.

6. Formal training in UX or visual design isn’t just a “nice-to-have”.

As someone who has had formal training (i.e., university, or a program accredited by a professional or governing body), been self-taught, and been taught on the job, I can say each has its advantages. One of the greatest advantages of formal training is the reduction of ego in the design process. This isn’t just so the designer can handle criticism, but so they welcome it, learn from it, and apply it just as constructively with colleagues and reports. Early in my career, I was occasionally suspicious if I didn’t get some criticism on a submission; it usually meant the work had been viewed only cursorily. And while I was sometimes told my work was of a quality that didn’t require constant supervision (a wonderful thing to hear), that quickly taught me I had to improve my own self-reviewing process, which I still use to this day.

I have learned many things through self-directed study, informally through training books, online courses, and personal experiments, but that route lacks second- and third-party interactive review throughout the process. Learning on the job does this far better, although the learning may be less consistent because it competes with job requirements and deadlines.

I particularly enjoy mentoring whenever I can. I used to say, “There are no stupid questions.” Now there is a caveat: “It isn’t stupid until you ask the same question over again.” I realize someone may not fully understand something without realizing it, but there is also a point where it is up to the learner to ask follow-up questions until they understand why something is being done, not simply that it has to be done.

7. Context is everything.

The context of a user experience sometimes seems so obvious that it needn’t be recorded or enumerated. This is often an error. In formal or complex systems, recording it is essential; in informal or common situations, it should still be addressed, even if informally, so long as it is captured somewhere in the process. The reason is that we overlook obvious things precisely because they are automatic in their execution: autonomically prefiltered sensory input such as background noise (audio, visual, or cognitive), and so on.

Two instances where I found this to have been a critical issue that was addressed far too late were:

  1. Designing (actually just updating) the UX for a car’s infotainment system. Looking at the data on mental models for media players on phones and in cars gave a rough idea, and there were also the company’s differentiating features, not least A.I. and machine learning in the media selection process. Tablets, and later in-dash prototypes, were used as proxies. All of this information was very helpful, but a flawed contextual assumption was testing without driving. While driver distraction was addressed in the assumed paradigms, they were never tested in real driving situations. This forced significant changes to the design recommendations, changes that ran counter to the product requirements, over the simplest of use cases.
  2. When designing for an enterprise radiology workflow, I was aware that the majority of enterprise radiologists work in dark rooms, and this was taken into account in the design paradigm. However, simply sitting and watching a variety of radiologists with different specializations work in those dark rooms made it apparent that the visual differentiators in on-screen data not only could be reduced but had to be: the current versions these radiologists were using were clearly distracting in a way that affected their attitudes while doing diagnostic work. While this change was not asked for by users or listed by product management, once implemented, the response was overwhelmingly positive, with no negative responses.

Each of these issues was eventually addressed, but late in the development cycle, where the required resources and time were far higher than they would have been had the issues been properly noted earlier. Incidentally, these were not the only examples, just two that could be described fairly briefly.

8. Flexible and scalable design or quick design.

Most people have heard the old saying “Better, faster, and cheaper: pick two.” UX design is similar, although simplified to two choices rather than three. You can’t do both, and you need to understand that each has its advantages and disadvantages. A design that is flexible and scalable requires much more time at the beginning, since it demands a fairly detailed understanding of the user and system ecosystem. That deeper understanding allows a more componentized approach to UX design: you can reduce the frequency and increase the speed of validation in the formative stages, and it can facilitate more scalable and agile engineering development once the designs reach that phase. Way down the road, in more summative testing, adaptation is easier, which often allows faster decision-making against important benchmarks and deadlines. This slower beginning, though, often requires a somewhat defensive stance, since there’s not much to show early on. I should note that flexibility and scalability, while different, can be designed concurrently with little additional time or resources compared to addressing only one, since the same resources and many of the same design considerations serve both.

The quick design approach gets an MVP to market, or at least in front of shareholders and sales, far quicker. Being fast and dazzling can be a boon to a startup with no product yet in the market; there is something very compelling about having a working product, as opposed to a prototype, in hand. It doesn’t show what it’s supposed to do, but what it does do. The big drawbacks come when changes or a new version are needed. Hardcoded applications require major rework, if not rebuilding the base code from scratch. And from the purely UX point of view, fast design, while providing a UX solution for that specific product, is likely to create expectations and mental models that are incompatible with new or different features. That can cause user fatigue, conversion issues, and inconsistent expectations rooted in the earlier mental model, leading to frustration even when the newer or replacement product is better in quality, features, and reliability.

Jared Spool wrote an excellent and more extensive article on this a few years ago called Experience Rot.

9. Don’t espouse good practices and then not follow them, or worse, punish those who do.

There are things I’ve heard over and over in various companies, well known both through popular repetition and through the truth behind them. The biggest causes of this problem are fear and ego, separately or together.

  • Don’t be afraid to fail. I’ve witnessed numerous punishments, and more than a few firings, over genuinely innovative initiatives that fell short, even when the failure came down to insufficient development time or unknown variables, or when a success was judged a failure against the exact letter of the original hypothesis. In many of these cases there was significant, usable invention and innovation that was often utilized later. These failures were punished because someone in a decision-making position felt threatened, or canceled a program out of fear of failure rather than its potential for success, because job loss and fragile egos loom larger than accolades and ROI.
  • Declaring that experience is essential, then placing hierarchy over experience. In an ideal business, and in fact in many ordinary ones, good leadership hires experts to advise and help the company succeed. Often, though, a long relationship with an industry does not mean a thorough understanding of it. Knowing the politics of an industry never equates to knowing its technology. They are separate domains of information, and like many areas of knowledge, genuine expertise in each requires (1) study, (2) practice, (3) many failures, and (4) enough successes to rule out luck. I know enough about corporate politics to want to be involved only when necessary and no more. But I also know that someone’s experience with the political aspects of a product or project doesn’t outweigh the technical and production side of it.
  • Ending meetings without a confirmed and mutually understood consensus. I have been in so many meetings where not only was nothing decided, but virtually everyone left believing there was a decision, and whatever they thought it was is what they were going to act on. Even a meeting where nothing is decided or resolved is fine, as long as everyone leaves with that understanding. There are plenty of good basic meeting practices out there; my point is simply to follow them.
  • Claiming a long-term plan that then changes week to week, or even day to day, on knee-jerk reactions at the decision-making level, so often that more resources are depleted spinning up and spinning down than actually producing anything. I often refer to this as the “sitcom logic approach”: an idea is presented, and on its first (and only) run it fails hysterically (it’s a sitcom; fixing it wouldn’t be funny). The idea is then abandoned for something else. No one tries to figure out what went wrong and whether it is fixable, even though these failure points are more likely minor missed considerations than catastrophic conceptual errors.
  • “No” and “I don’t know” are negative only when viewed from an ego standpoint. Dismissing or shutting someone down for the first (“No”) dismisses their experience and knowledge beyond what you may know; doing it for the second (“I don’t know”) dismisses curiosity and an opportunity to learn and innovate.
    Adam Grant has spoken and written about the success of those whose careful use of “No” has improved their productivity, businesses, products, and more. I highly recommend his books and videos, so I needn’t simply repeat him here.
    As for “I don’t know”: in the design world, ego relegates it to a lack of intelligence or experience, when the opposite is generally true. It is the primary door to both knowledge and experience. First, when you say it, you are responding to a question or situation for which you have no answer, which tells you exactly which subjects or concepts you need to learn. While learning them, you can relate them to your own life experiences, often as they happen around you; that observation is the first form of experience. The second form comes when you put the newly learned concepts into practice to see where and how they succeed and fail.
  • “Standards” that are really just “trends”. There are so many examples that it wouldn’t be difficult to compile a top 100 list of trends that became “standards” yet still went away as trends do, just more slowly, because far more money and time had been invested in them. I’ll use one example: the open office environment. I have worked in everything from office buildings with actual offices to “office” buildings that look like well-decorated warehouses with modern desks. I began seeing the first long-term studies almost 20 years ago documenting the inefficacy of the environment, and not surprisingly, studies continue to reveal the same things:
    • They are intended to foster collaboration – but they reduce it
    • They are meant to create a more social environment – but they increase the need for privacy beyond what would be normal privacy expectations.
    • They increase distraction
    • They reduce productivity
    • They increase offsite work even when that’s not the desired effect.

The part I find amusing is that most adaptations to the problems of the open office environment (privacy rooms, quiet or comfort areas, gaming areas) are symptomatic cures that don’t address the actual problem.

10. The UX unicorn problem.

I am a UX architect who started out as a graphic designer (because that’s what we were called back then). I was an editorial art director in medical publishing, and I designed advertising for everything from food courts to couture retail. Then I got a job at Xerox. That began my journey into what was to become UX, through early (and unbelievably expensive) computer graphics and animation and years of instructional design and early computer-based training, and so I went back to college to learn programming. That was useful because it taught me two things: first, how to design and write code; second, that I didn’t want to be a programmer, though understanding what was going on in code and how to engage with software engineers proved incredibly valuable. I then spent time developing programs and learning about the complexities of how people interact with machines, computers, and sometimes simply objects. I got to work with a lot of testing environments, from summative testing in highly controlled labs with eye-tracking equipment to simple formative testing with quantitative and qualitative results as needed. I did a stint at Gartner designing data visualizations for their top consultants for their world symposia, and designed UX for VOIP systems (for everyone from regular users to administrators) for what used to be ShoreTel (now part of Mitel). I’ve designed radiology enterprise systems (PACS) and voice-controlled, voice-enabled vehicle infotainment systems.

With all that, I find the unicorn designation problematic. While I can do a lot of things because I’ve had broad experience, I would rather apply all that experience to creating really elegant and effective UX. That doesn’t mean something spectacular; spectacle is really about visual design, and for the user the process is about the result they want to accomplish. Nor does it mean a really interesting interaction experience; that applies to game design more than to common UX. I have often said, and will continue to say, that if my work is noticed by the user, I have failed. This goes well beyond the rudimentary expression “If the UI needs to be explained, the UX has failed.”

Here is where the Socratic adage “The more you know, the more you realize you don’t know” becomes so apparent. The unicorn UXer may seem able to do many things, but all the time they are doing user research, they are not doing strategic design. All the time they are organizing, running, and analyzing user tests, they are not designing wireframes. All the time they are creating visual assets, they are not establishing a design language and its documentation. All the time they are managing tactical aspects of implementation is time better spent establishing the standards of Human Interaction Guidelines. Software development gets distributed and delegated, yet there is so often an expectation, for no good reason, that a UX designer can do everything concurrently.

For all the boasting about UX being of paramount importance, many companies invest precious few resources in it and understand neither its complexity nor its process. So when I see a company looking for a unicorn UX designer, it’s typically an underpaid position that will either set the employee up for scapegoating or burn them out in no time. More likely still, they are hiring for a position without really understanding the needs of the user, all while being certain they “just know, because it’s common sense.” This dangerously and incorrectly commodifies UX design work and, worse, almost forces mediocre work. It removes the UX designer’s ability to design an elegant interaction and forces them into high-speed triage. Such situations do happen in the best of circumstances, and a solid grounding in formal training plus many years of experience increases the likelihood that the quick solution will be a good one. It is, however, a bad approach to design and development.

In summary

While I’m often surprised at amazing outputs built on the luck of an early idea being successful, I’m far more impressed when the success of an outcome comes from diligent, well-thought-out work, since that kind of work is far more likely to lead to further successful improvements in the future.


The sitcom logic approach to failure.

By Robert R Glaser
UX designer and architect.

This issue seems omnipresent in many businesses, save for a minuscule percentage. Where it isn’t present, it is often replaced by a bulldozer approach or, worse, problem-solving by random guess.

What do I mean by this? It’s surprisingly easy to describe and will be easy to recognize. Whether you watch old sitcoms on TVLand or new ones on a streaming service, a plot device commonly used is that one or more characters are presented with a problem (suddenly needing cash, meeting someone, fixing something). They quickly realize what needs to be done and devise a method to achieve it. The method is typically silly and irrelevant; something happens to quash the process, and they fail. Here’s where the sitcom logic comes in: after the failure, the concept is discarded in whole, without a thorough postmortem to determine where the problem truly lay, which is usually in the method (which, for the sake of the sitcom, is usually silly). A review may be performed and some superficial details addressed, but not thoroughly enough, so the entire project (including the initial theory, which is usually valid) is discarded.

Jeff Catlin, writing in Forbes, discusses the failure of IBM’s Watson for Oncology at MD Anderson due to a lack of consideration of varying cultural models, which left a 62-million-dollar investment halted. Interestingly, I had previously worked at Philips Healthcare, and one aspect I had to consider in any UX design work was that the resultant design needed to work in multiple markets outside the US. Something that has little or no value in the US may have significant or even critical value in Germany or the UK.

Another example is expressed through innovation and invention without proper commercialization. The story of Apple’s visits to Xerox PARC has become mythologized, but in the long run, Xerox and Apple both had technologies that revolutionized the personal computer industry. The difference is that Apple didn’t discard or ignore those technologies, but commercialized them (an effort significantly surpassed by Microsoft) to the point where they are ubiquitous across hardware, operating systems, and applications.

Don Norman has talked about how it often takes decades for a new technology to be adopted on a massive scale. The adoption isn’t merely waiting for the technology to be cost-effective, but for it to be viewed as non-threatening (even if it never was threatening) and not foreign in its mental model, even when it’s actually easier to use or manipulate. He has used the touch screen as an example: it took over 30 years to become ubiquitous, from its functional invention in 1975 and first production in 1982 to real large-scale adoption with the iPhone’s release in 2007.

My own experience has shown this is often true, particularly when some new technology is trendy. It is common for people to present it as a forward-looking solution or methodology without even a superficial check of whether it’s actually useful or appropriate for the case at hand. Currently I work on the in-vehicle experience and on what technology can support it. The most frequent error I hear is “Why don’t we do X? I can do that on my smartphone,” without the simple consideration that cell phone use in cars is restricted (to varying degrees depending on individual state laws, some of which allow you to do almost nothing with a phone). And while it seems obvious to some, most people don’t realize that the mental model is significantly different: using a smartphone requires, or draws, full attention, sometimes to the point of being unexpectedly and dangerously undivided. If you haven’t seen the YouTube videos of people walking into walls, poles, objects, and even other people while absorbed in social media or a game, feel free to look them up.

Decisions need to cover a range of issues, and those issues should be graded by how and what biases drove them. Common food is a good way to demonstrate this. If you poll people across the US about their favorite foods and also about which foods they consume most, you will probably find some overlap, but the two lists are not the same. You would also find that they change over time. Variables like cost, availability, and trends have a significant effect, and the lists change between regions, and even more across countries outside the US. This raises an important issue: the more people you add to the averaging process, the less likely you are to find a genuinely “average person”.

What this means from a UX standpoint is that designing to the average creates a mediocre outcome for most. You may have an excellent theory for the UX, but the data it is based on drives a solution that takes no individual into account and therefore often pleases few users. The difficulty is in figuring out how to effectively subdivide the user population so that each subdivision has a way of letting the UX be seemingly customized to it (as opposed to discrete customization made by a user). These subdivisions can be by age, culture (geographical and/or racial and/or religious), economics, sex/gender, or education, and there may be others depending on the target audience. Some may not be relevant to a specific case, but they should be addressed before they are dismissed. This issue is a common driver of mediocrity or, worse, failure.
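To make the averaging trap concrete, here is a toy sketch (numbers invented for illustration) in which two subgroups with opposite preferences produce a population mean that describes neither:

    # A toy sketch (invented numbers) of why designing to the average fails:
    # preference scores run from 0 (dense, expert-oriented UI) to 10
    # (sparse, guided UI).
    novices = [9, 8, 9, 10, 8]  # want a sparse, guided interface
    experts = [1, 2, 0, 1, 2]   # want a dense, expert interface

    everyone = novices + experts
    average = sum(everyone) / len(everyone)
    print(f"Population average: {average:.1f}")  # 5.0 -- a midpoint no one asked for

    # Segmenting first reveals the real targets:
    for name, group in [("novices", novices), ("experts", experts)]:
        print(f"{name}: mean preference {sum(group) / len(group):.1f}")  # 8.8 and 1.2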

What often happens here is a mistake common in application development: placing all features at the same level overwhelms the user, so certain features are eliminated (along with the users who find them important) rather than finding ways to address smaller percentages of users. This can cause the application to fail to gain a growing, or even sustainable, user base.

These kinds of issues are common in applications with large feature sets. You would be unlikely to find a user (other than someone certified to teach the application) who uses all the features of a complex system. Complex applications like this often have overlapping user bases that utilize the application for different purposes. Adobe’s Photoshop is a good example, as you can see by opening and examining all its menus and submenus. It has users who are professional illustrators, photographic retouchers, or visual designers for application development (software and hardware), in addition to hobbyists and even people who specialize in creating maps for CGI (3D) work. There are sets of tools for each of these groups that are often never used by the other groups but are critical to that group’s work; other features overlap several groups, and a few are used by all. The interface for Photoshop is customizable to optimize for whatever the user’s primary task is. The trouble comes when decision makers are out of touch with the actual users, or worse, believe that their own use paradigm is (or should be) applicable to all.

So when, in circumstances like these, there is a failure and the solution is discarded, the problem is often reframed under the unreviewed assumption that the initial problem was wrong, when in fact it was the solution that was wrong. Meanwhile, there may be no simple review of whether the problem being solved actually needs solving. I may have a revolutionary solution to a problem, but if no one has any interest in solving that problem, the implementation may be successful while the product fails.

Really innovative companies usually release products with a primary intent and some ancillary solutions as well. Once in the market, users sometimes focus primarily on one or more of the ancillary capabilities and minimally, if at all, on the primary function. Instead of seeing the product as a failure, the company simply starts treating the secondary functionality as the new primary feature(s). If they are really driven by users’ needs, they will genuinely assess whether the primary function is simply not needed or just not well implemented. It takes fairly rigorous evaluation by the decision makers to see past their individual confirmation biases.

Personally, I learned a long time ago the deep importance of “I am not the user.” This has been really useful when going through user result analysis. Outside of basic heuristic evaluations, I always assume my preferences are atypical and therefore irrelevant. This way I’m more open to alternative viewpoints, and particularly interested when many of those alternative viewpoints are similar. That becomes a simple, if unexpected, target. I can then see whether the original problem definition was wrong, or the solution was wrong, or both. We do learn from our failures.


Why we should be removing ‘democracy’ from Design Thinking (and maybe Agile/Scrum processes too.)


By Bob Glaser, UX designer

Design Thinking has been around for almost half a century. It has been used successfully for many of those years, and yet, as it has gained significant momentum in the last decade, it has also been reformulated, varied, simplified, altered, and “fixed” by various purveyors, many of them repackaging and, more importantly, reselling the concept as a training program or consultancy. Because of the breadth of design thinking, I’m assuming the reader is already aware of it and likely using it, so I will not go into a detailed description.

One of many concepts I have seen corrupt outcomes is injecting democratic decision making into the process. Why is this corrupting (bad) for the success of the process? Because it can have the effect of dismissing the process’s very real positive outcomes.

How?

First, let us consider the process. For the sake of clarity, I’ll use the Nielsen Norman Group’s description, since it addresses the process in a straightforward, applicable way rather than a broadly conceptual one. (There are many other suitable versions out there, including some of the original concepts refined by the Stanford School of Design, which simplified the original 7 steps to 5; some are overly detailed for the purposes of this post, though they are just as exposed to democratic corruption.)

The process is a simple, semi-linear, circular iteration:

  1. Empathize
  2. Define
  3. Ideate
  4. Prototype
  5. Test
  6. Implement

The first 2 are the ‘Understand’ phase, 3-4 are the ‘Explore’ phase and finally 5-6 are the ‘Materialize’ phase.

Since the process combines the seemingly paradoxical pairings of logic with imagination and systemic reasoning with intuition, it is susceptible to being adapted in ways that defeat the purpose of its results.

When a group begins this process, they consider the user’s needs, the business’ resources/viability, and the technical feasibility/capabilities. They then follow the process and come up with potential solution(s).

The problem arises at this point.

This common error is taking the potential solutions and voting on them. The problem with this approach is that it tends to throw the base concepts out the window in order to settle on a solution. Sometimes the vote is steered by constraints, such as choosing low-hanging fruit that sits low on the priority list simply because it is easiest to deal with. This is often compounded by artificially imposed resource limits, stated as something like “We are only considering the solutions which can be accomplished in [time frame]” (or some similarly artificial or arbitrary constraint). Then the group votes on solutions within these constraints.

Since the purpose of the process is to determine the solutions that need to be addressed*, the results are corrupted by a democratic vote that dismisses the effective, and hopefully innovative, result. The intuition and imagination of the solution-creation process are meant to operate alongside logic and empirical decision making. Design thinking uses these empathetic concepts to help frame or reframe the problems and potential solutions, bringing creativity to the process rather than a methodical scientific-method process alone, and thereby produces more innovative solutions. It should be noted that design thinking is simply one of many ways to help produce effective, implementable solutions.

The vote may easily (and regularly does) cull the solutions, eliminating the best, most ideal, or most effective ones from the standpoint of the user.

*Design Thinking is a solution perspective as opposed to the problem perspective of the scientific method.

How to deal with this democratic corruption?

This is fairly easy, though often unpopular because it requires a little extra effort. While the group is in the early stages of gathering information (the Understand phase), it should also be defining the requirements of acceptance. These requirements become the filter that solutions are run through to determine which will be implemented. If one is determining the requirements of an MVP (minimum viable product), it should be easy to say that one solution is effective but not necessary for MVP while another is absolutely required for it. Then, for the solutions that may or may not make it, the same criteria are applied, and instead of serving the egos of the design thinking participants (in the business/company), the results serve the needs of the users.
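As a minimal sketch of the idea (the solutions, requirement names, and criteria below are hypothetical), the acceptance requirements defined in the Understand phase become an explicit filter applied to candidate solutions, rather than a matter for a vote:

    # A minimal sketch (hypothetical names and criteria) of filtering
    # design-thinking solutions against acceptance requirements defined
    # up front, instead of voting on them afterward.
    from dataclasses import dataclass

    @dataclass
    class Solution:
        name: str
        meets: frozenset  # IDs of the requirements this solution satisfies

    # Acceptance requirements captured during the Understand phase.
    requirements = {
        "R1": {"desc": "User can complete the core task unaided", "mvp": True},
        "R2": {"desc": "Works offline",                           "mvp": True},
        "R3": {"desc": "Supports team sharing",                   "mvp": False},
    }

    candidates = [
        Solution("Guided single-screen flow", frozenset({"R1", "R2"})),
        Solution("Collaborative dashboard",   frozenset({"R1", "R3"})),
    ]

    mvp_ids = {rid for rid, req in requirements.items() if req["mvp"]}
    for s in candidates:
        verdict = "required for MVP" if mvp_ids <= s.meets else "effective, but not MVP"
        print(f"{s.name}: {verdict}")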

This is not a flawless approach, but it helps define requirements for solutions more effectively. If it doesn’t, then that lack of effectiveness becomes a solution issue for the next iterative round of the process.

I should note that this particular issue occurred to me in sprint planning meetings, where what will be accomplished is based not on needs but on schedule first, then resources, then needs. In that scenario, “needs” are the first thing dropped, because their priority is wrongly demoted to last. Design thinking places them first, and if democratic corruption doesn’t demote them, they remain in the forefront where they should be.

I should also note that processes that are not directly user-oriented can still be effectively addressed by design thinking, by considering the indirect effects on people of the process(es) being addressed.


Correctly Dealing with the 5% Use-Case

I have noticed a common myopic view in the handling of edge cases around ±5% use-case features (those features used by only about 5% of users). These can be outliers, expert users, or special-situation users (by job, environment, age, or other demographic). I should note that the 5% is a quasi-arbitrary small number: it represents a portion of users that isn’t so small as to fall outside the MVP population, nor large enough to automatically count as a primary use case. It will, and should, vary with the size of the user base and the complexity of the application.

The problem is that this group of users is often either not parsed properly or not defined as a cumulative grouping. These exceptions tend to be handled exceptionally well in some highly complex professional applications (those heavily loaded with specialty features), such as Photoshop or some enterprise medical imaging software. Outside of such cases, they are handled improperly by many UX designers, or by company-defined design processes that are often, though not always, outdated.

Concept, execution or explanation.

I’ve seen many concepts and projects fail not because they weren’t good, useful, saleable products, but because the product was marked as a failure through a lack of understanding of either the problem it solved or the benefit it provided.

The solutions can be:

  • A simple visual that displays a complex interaction by literally showing the difference in a real-time, real-life manner.
  • A simple overall description that encompasses an otherwise incomprehensible number of features, or that shifts the focus from the features themselves to their “simple” integration.
  • Fixing an ineffective or even (business-wise) inappropriate choice of data used as the comparative sample, when that data fails to present the benefit of the concept.

The first example often happens when dealing with an audience that may not be able to visualize the solution being described. This inability to visualize opens the door to all kinds of cognitive biases. For example, in a necessarily complex UI I was working on, I had suggested a fine (2-pixel) line around an active study (in a radiology environment). The description alone was dismissed, and a myriad of grotesque alternatives were proposed; these were too severe and problematic to implement, since most focused on one aspect without considering the complexity of the UI. So I showed, in a simple two-page PowerPoint, how it would appear when a radiologist selected a study. The concept, previously rejected, was unanimously approved (by both sales leaders and clinical specialists), simply because the actual images were “real” in terms of exactly how it would look on the screen, with nothing left to the imagination.

The second example comes from having an application that can do many things through a central integration point. Each of these features has a high level of desirability to overlapping markets. The problem became apparent when questions from the audience would sidetrack the central focus (because it was not clearly defined) and the presentation devolved into a litany of features (few of which were particularly remarkable on their own, and others of which were remarkable but undifferentiated from the less remarkable ones). Here the solution was to present the integration and central focus point as the true benefit of all these features.

The third example is surprisingly common. Here, the functionality is properly and thoroughly presented, but the sample data used is too small or too random to demonstrate effective results, or not “real” enough to correlate with results that demonstrate the power of the functionality. Suppose the functionality is a home-based IoT climate control system that uses machine learning to learn usage patterns for specific households. If the database being used is not a real aggregated database of individual home data points but an artificially generated one, based on real data but randomized for privacy or security reasons, then the resultant analytics will be equally randomized and fairly useless, since it will be impossible to show actual use cases for various demographic (or other) filters. The displays the algorithms drive may be dynamic, but they will show no consequential, actionable results. The audience simply sees that the product does something, but cannot see how it could show them anything useful, whether basic information or unexpected patterns in specific groups. This ends up being a lot of effort whose result isn’t much better than saying, “Believe me, it really works, even though you can’t see it here.”
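A toy sketch (synthetic numbers of my own) shows why: shuffle a usage series in the name of privacy, and the very pattern the demo is supposed to surface disappears.

    # A toy sketch (synthetic data) of why randomized demo data defeats a
    # pattern-learning demo: a daily usage pattern that is obvious in the
    # real series vanishes once the values are shuffled for "privacy".
    import random
    random.seed(0)

    # Invented household heating usage: morning and evening peaks.
    real = [3 if 6 <= h <= 8 else 4 if 18 <= h <= 22 else 1 for h in range(24)]

    shuffled = real[:]
    random.shuffle(shuffled)  # stands in for "randomized for privacy"

    def peak_hour(series):
        return max(range(24), key=lambda h: series[h])

    print("Real data peak hour:    ", peak_hour(real))      # 18 -- a believable evening peak
    print("Shuffled data peak hour:", peak_hour(shuffled))  # an arbitrary hour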

Also consider:

Another aspect of this 5% user base is that the use case could be a critical but one-time use for 5% of the population, or a regularly required use for 5% of the population. And this 5%, regardless of which of the two groups you’re addressing, could be a different 5% for each of 19 more features or capabilities. In the first case, the feature can be buried in the preferences, while in the latter it can be buried at the second level (two clicks away) with the option of a custom shortcut.

These may seem obvious, but they require diligence, because they tend to be considered only during a specific phase of design and development, when they should be considered throughout: from ideation through post-production maintenance, version increments, and postmortems.

Summary:

This is a cursory observation of the problem (meant to initiate the conversation). There is no one solution; rather, the problem should be considered in advance of the presentation so that a proleptic approach can shape the presentation’s structure. I personally like to think of it as using the scientific method to create the presentation of the concept: theorize, test, and focus on flaws rather than positives (assume the positive is the initial concept, which you are trying to find flaws in before someone else does, or simply to validate its quality), and fix what you can.


Correcting perspectives in UX design


There are several generally accepted factors that guide UX:

  • Its effectiveness (simplicity, ease, and functionality.)
  • Its lack of obtrusiveness (it gets your attention based on criticality or “on demand” need.)
  • Its implementation of accepted technology vs. new technology within a domain.
  • Its forgiveness of error.

Effectiveness

This is often a catchpoint. The level of simplicity needs to be commensurate with the task at hand, for example, contacting someone vs. performing a diagnostic procedure. The common error here is negative simplification: simplifying a complex process to improve viewer numbers without considering that the process requires many possible branching decisions, each of which may reveal a new set of choices. If a product is a single-function tool, then the MVP (Minimum Viable Product) is easy to define. If, however, the product is a set of tools used to complete a generalized task, then we can often (not always) infer that completing the task may require a constantly changing set of tools due to unknown variables. In the latter case there are some tools that will be used all the time and others that will be used less frequently, but it is important that the less frequently used tools are ALWAYS available, because their need cannot be determined at the beginning of the process.

Part of this issue is determining the importance of the task and its related processes. In surgery, for example, most processes are critical even if no unexpected errors or situations arise. A phone call, on the other hand, could be casual and of minor personal value, or of critical need, depending on the situation. Further, a game poses no threats at all, but may anger a user if there are bugs in the process of play. Lastly there is the capture of information. This can be as simple as writing or recording done only for reference or posterity, not required for the presentation of the information, which may be meant for listening only; the capture in this case is an indirect reinforcement of hearing or seeing the presentation but has no actual effect on its outcome. (This, like many concepts, could easily be rabbit-holed, but I use these ideas for high-level differentiation.)

In terms of ease of use, we first have to define whether the thing should be easy to use. Child-proof safety tops and catches are an example of why ease of use should not be applied blindly to everything: they are, by design, intended to limit use to those who already understand the reason for it. The same can apply to professional applications, where complex work requires a complex tool set.

Lastly there is functionality. Many complex processes can be simplified, but there are other complex processes for which simplification reduces effectiveness: removing the decision points that allow on-the-fly adjustments to environmental and other unpredictable variables can produce flawed, if not catastrophic, results.

Obtrusiveness

This varies by use case. Without a fully effective AI, there is often no way to determine what should draw the user’s attention to an attribute of a complex system. Regulatory, safety, or security requirements may define the minimum parameters for getting the user’s attention, but that still doesn’t address multiple points of attention of similar weight or value being required concurrently. In those cases it is up to the user to determine which to act on and in what order. Again, unknown variables may, necessarily, affect the user’s process, and they may present in ways the tool is not designed for. This doesn’t mean the tool should be altered; it may already be a highly effective single-function tool. Rather, it can be left to the user to determine the order based on their assessment of the newly or suddenly presented variables. That is why I said only a fully effective (and mostly non-existent) AI would suffice.

If we define the rules by which something should be presented to the user based on empirical use cases, and also mitigate the potential issues if the information is ignored or missed, then implementation becomes far easier. It’s just not common that those use cases will safely cover the errors that could be problematic.

Then there is the issue of what method is used. Here, we should keep in mind that new technology is far more quickly accepted by the product development community than the world at large. This has to do with issues of confidence (will it work right?), trust (do I want to share this information?) and technological maturity (can I afford it? Or is it too cumbersome?)

Consider the concept of the future in the 1950s, with the idea of the TV picture-phone. It was perceived as a marvel of new technology, but what no one thought about was that people didn’t want to be seen at home in their underwear when they answered a phone in the early morning. It was decades before Skype and FaceTime were used with any regularity, and even then only when people were prepared to use them. Video calling is still mostly used for long-distance calls “back home”, perhaps to another country, or for long-distance business interviews and conferences. Even now, thinking of the last three companies I have worked at, I have often seen content being shared but only extremely rarely live video of the people in these conferences. There is a level of privacy that people still hold onto across the globe when it comes to what and how much they wish to share in a communiqué.

There are other similar issues with new technology that are foreign to many users and for which there is no standard. Even gestural touch interfaces don't have a consistent standard, though they became widely available almost a decade ago. Where cultural pseudo-standards exist, they are often context specific: "swipe right" has different connotations depending on the context in which it is used. Even the order of digits on a phone keypad and a calculator keypad has never been harmonized (a dialpad has the "1" in the upper left corner while a calculator has the "7" there), so users carry two conflicting mental models for the same ten digits.
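For anyone who has never put the two side by side, the mismatch can be expressed as simple row arrays (a quick factual illustration, not part of any product code):

    // Same ten digits, opposite vertical order: two conflicting muscle memories.
    const phoneKeypad: string[][] = [
      ["1", "2", "3"],
      ["4", "5", "6"],
      ["7", "8", "9"],
      ["*", "0", "#"],
    ];

    const calculatorKeypad: string[][] = [
      ["7", "8", "9"],
      ["4", "5", "6"],
      ["1", "2", "3"],
      ["0", ".", "="],
    ];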

Accepted vs. New technology.

The touch screen has been around for half a century but was not widely accepted until the last decade, and even that wasn't instantaneous, particularly given, as mentioned above, the lack of any standardization (other than implication) of gestural use.

While technologies like VR have great possibilities, there are still the issues of acceptance, standardization of use, and problems like motion sickness that have not yet been dealt with effectively.

Additionally, perceptions of any area of growth often mistakenly discount the leveling off, or even drop-off, that comes from saturation of the market, from replacement by a different technology trend (even a less effective one), or simply from the limitations of a technology once it reaches the point of diminishing returns.

Since I live in Silicon Valley, there is a bubble effect of people seeing technology all around them and assuming it is ubiquitous, when in fact it may only be "ubiquitous" in high-technology areas and/or areas of high median income. As soon as these inhabitants step outside into a more ordinary area, they realize that the very technology they depend on is not only unavailable but may be viewed with suspicion. Consider the rise and fall of Google Glass. While the technology was amazing to early adopters, they hadn't considered that many others saw it as an invasion of their privacy. It wasn't uncommon to hear a conversation between someone wearing Google Glass and another person who would ask, "Are you recording me?" and then not really believe the answer either way. This is not to say it was useless, but rather that it would be effective only in specific situations and unacceptable in many others.

Other types of feedback systems, from haptics to neurological implants, have promise but are still far too nascent to expect wide acceptance.

Error forgiveness.

This goes far beyond the system error of the past, and it is an area of constant annoyance. Consider that there are whole internet sites devoted to posting the sometimes hilarious, sometimes embarrassing mistakes of autocorrect. "I like it when it works" is a common cry amongst texting pairs who haven't turned it off. As it stands, it can speed up communication, but it can also lead to rather severe errors.

While basic machine learning algorithms can address some of this, it would take a deep learning system to learn the cadence and style of an individual's communication, including context, intent (sincerity vs. sarcasm), interests, vocabulary level, etc., along with the context of the person you're conversing with, since the language between a parent and child and between two intimate partners may be extremely different even though two of those people could be the same person. This makes for complex interactions that can't be ignored.
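As a toy sketch of just the last point, that suggestions should depend on who is on the other end, consider something like the following. The contacts, registers, and word lists are all invented for illustration; a real system would have to learn these rather than hard-code them.

    type Register = "formal" | "casual" | "intimate";

    // Invented examples: in practice these mappings would be learned per contact.
    const contactRegister: Record<string, Register> = {
      "Dr. Lee": "formal",
      "Mom": "casual",
    };

    const vocabulary: Record<Register, string[]> = {
      formal: ["regards", "regarding", "reschedule"],
      casual: ["lol", "later", "lunch"],
      intimate: ["love", "miss you"],
    };

    // Rank completions from the register of the current conversation first.
    function suggest(prefix: string, contact: string): string[] {
      const register = contactRegister[contact] ?? "casual";
      return vocabulary[register].filter(w => w.startsWith(prefix));
    }

    console.log(suggest("l", "Mom"));      // ["lol", "later", "lunch"]
    console.log(suggest("re", "Dr. Lee")); // ["regards", "regarding", "reschedule"]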

 One final note:

Most of my posts are directed at more advanced areas of UX design. It is for this reason that there are not a lot of pictures as samples. I point out examples within my posts that anyone beyond the beginner (and any critical-thinking beginner) will understand. Additionally, I find superfluous imagery tends to belong with "top ten" lists and other basic treatments of design. I will always use imagery when it simplifies or clarifies a particularly different, new, or complex concept. Imagery can also limit the conversation, since any advanced designer will already be good at visualizing how a concept fits their own milieu of design work.

 

Posted in Uncategorized

Honesty in UX

By Bob Glaser, UX Architect

One of the great inefficiencies in UX design comes from the various forms of lack of honesty. This happens in both individual design and collaborative design. I chose that wording because dishonesty implies intent, while "lack of honesty" includes neglect, cognitive biases, and the like, along with intent. While empathy with the user is an essential component, rigorous and sometimes brutal honesty is just as essential to good UX design.

If you can accept kudos for successes, then you must accept blame for failures. Failures generally teach us more than successes. "Best" is a comparative term with no indication of where you sit on the scale between total failure (0, if you will) and perfection (approaching infinity, if you will); it is based only on what currently exists. Even then, if all previous incarnations of a concept rate at, say, 5 and you've hit the 10 mark, you have improved the concept by 100%, but you have no way of knowing whether perfection is 15 or 150,000. This makes it easy to stagnate on the laurels of success.

Failures, however, are concrete and finite. Through rigorous honesty, we can, and always should, find the root causes. There is almost always more than one cause, so you shouldn't stop a failure investigation once one answer is found. This is a good place to implement the "5 whys" approach as a start. The five whys are well described in many Lean processes, so I'll not repeat them here.

I find it perfectly acceptable to make myself unpopular in meetings. When, after I present a solution to a user's problem as having succeeded in user testing, an internal (within the company*) comment comes back of "I don't like it," "It's ugly," "It's too plain," or "It's not exciting," the response ought to be: "Thank you for your opinion, but it is not relevant here. You are not the user, and neither am I." The aggregated feedback from user testing is factual. I am quite aware that both formative and summative user testing may, by the necessity of the product design and use, require user opinion, but this opinion is part of the aggregate scoring and should be consistent in its testing application, non-leading in its style, and evenly distributed for accurate representation in the aggregate totals. The comments made are still taken into account, because they may point out an area of potential improvement. This is how we appropriately balance objective results with subjective impressions.

Another place where honesty is needed is in the "big picture" integration of many features in a complex system. An example might be an enterprise system with primary, secondary, and tertiary (and so on) user groups, each with varying needs and perhaps different UIs from the same system. Often, particularly in Agile development environments, individual features are addressed in an unintended silo approach that places "common expectation" or "intuitiveness of a single feature/function" above a common UX design of integration and unification. This approach averages rather than optimizes the UX and UI, and it matters in exactly the kind of enterprise product described above, where the hierarchy of users may not correlate to the expected hierarchy of user numbers (e.g., the assumed primary user group may be only 10% of the total user base). This is not the fault of the Agile method, but the Agile process does allow it to be easily ignored or glossed over. (We must remember that Agile was developed as a software development process without UX as part of its initial design; there are many good articles out there on methods for incorporating UX into the Agile process.) Attending to the big picture may seem counter-intuitive to feature design, but what it does is help reinforce a common mental model of a complex system.

Next is honesty in the priority of function. I have often seen great (and financially disproportionate) effort spent on infrequently used or needed features. I think of this as "pet project syndrome." Another cause is the failure, or insufficiency of effort, to clearly define feature priorities with weights (based on user needs) in a form rigid enough to create a reasonable goal. The cost of this deficit of honesty is the loss of focus on primary functionality. This is also one of my favorite areas where the "bucket of rationalizations" is brought out to justify poor decisions in the design process. Here is fertile ground for false dichotomies and false equivalencies. Fast decision-making often masks these mistakes and makes them difficult to see until it's too late. This is frequently the result of numerous directional changes within the development cycles and heuristic iterative processes prior to user testing.

Another area is democracy in design. This is a practice that I feel should be abolished after the first heuristic phase of formative evaluation. After that, the only time this kind of voting should be applied is with a group of well-targeted users who have just tested the product or prototype. Votes taken in a board room are not only of little value, they can be counterproductive and costly. Even in heuristic evaluations these votes can be problematic, since equal weight is given to the UX designer, feature creator, feature implementer/developer(s), system architect, technical product owner, and marketing product owner. Each of these people has an agenda that may be rationalized as user-centric when underneath there may be other reasons, conscious or unconscious. I include the UX designer as potentially influenced here too. Basically, it comes down to the simple fact that the further you get from the user, the more likely you are to get decisions based on concepts irrelevant to the user. It is easy to fall into the trap where "these decisions affect the user in the long run" becomes a rationalization for business cutbacks based on time or resources, while the true effect on the user is negligible or even counterproductive. This is not to say such decisions should be dismissed, as they may have significant business relevance, but UX should not be invoked in them unless there is a measurable and direct 1:1 relationship.

Any good designer knows that it is not their "great taste and discernment" that makes them a great designer, but rather the ability to create something they may personally find "ugly" in concept, aesthetic, or even at the cognitive level, while realizing it is ideal for the end user. If you want to create art, then become an artist, where your ego is an asset rather than a liability.

Another is the top 3, 5, or 10 list. This not only smacks of amateurism but also ignores the fact that the number is irrelevant when it comes to any MVP (minimum viable product). The feature list for an MVP should be changed only when a serious deficit or redundancy is discovered, not based on anyone's personal whims (though these whims are typically presented as essential, often with circular logic or specious arguments or examples that are not properly weighted). I have personally turned down offers to write "top ten things…" articles, since any good professional will know them already. They are useful for the beginner but have the dangerous flaw of being viewed as intractable rules.

To me, my best work is invisible. My favorite analogy for this is the phone. When users want to call their mother, their goal is to be speaking with their mother. Not a fun dialing experience, not a beautiful "dial/send/connect" button. Just talking to their mother. The practical and physical tasks needed to accomplish this should be so seamless, intuitive, and obvious that the user may not even be aware of doing them. The challenge here is in getting users accustomed to something that is new to them or different, or that requires trusting that a step they used to perform explicitly now happens for them. A common example is the elimination of the "save" function in Apple's iOS: there were plenty of people who didn't trust it, or who would constantly check, until they trusted that their input was saved automatically. The caveat being the "Save As" function.

I should point out here that while I believe facts rule over opinion most of the time, I will always concede that our end users are human. There is much more than logic and statistics involved here. Culture, education level and intellect, the common mental models of the user base, and other psychological factors have an important place in UX design, as do limitations set by safety, regulation, or budget. The important thing is to make sure that honesty is not pushed to the sidelines because of these additional variables, but rather is treated as an important way of dealing with them.

* These examples are based on my experiences at over 13 companies (every company I've ever worked for, so this isn't an indictment of any one company but rather a common systemic problem) as well as examples given directly to me by many other great designers, including Don Norman.

Posted in Uncategorized

The disparity of eye vector orientation and proprioception demonstrated with the Oculus VR.


Recently, after playing with my Oculus VR through some games and environments with a Galaxy S7 Edge, I spoke with some other users, several of whom complained of becoming nauseous after some use. Unlike about 25 to 33% of the population (depending on which statistical data you use), I am not prone to motion sickness, so I hadn't experienced this. I questioned those users and found they had all been prone to motion sickness to some extent. I theorized that the cause was that the headset's motion sensing tracks head position and not eye direction, a distinction that is a major factor in common instances of motion sickness.

For example, someone prone to motion sickness may easily drive a car on a winding road with no effect, but if they are a passenger instead of the driver, they are more likely to be looking in a direction other than directly forward. The driver looks slightly to the left when turning left or slightly to the right when turning right, in other words, maintaining a view on the vector of movement, slightly ahead of the current position. As soon as an individual separates the direction of view from this vector, any mild disorientation is likely to initiate the motion-sickness effect.

The same is true when wearing an Oculus VR headset. The fact that there is no eye tracking leaves the user with a disparity between the directional viewing vector and head orientation, which will cause motion sickness in those prone to it.

This is noticeable in a game that uses the orientation of the head to create a point in the virtual space "in front" of the user. The point moves only when you move your head; when you move your eyes, it doesn't move. This creates an interesting disparity between a seemingly immersive virtual environment and the way the brain processes visual information, using both proprioception (primarily of the head) and visual vectors of orientation. When these are disconnected, as in a virtual environment, there is a gap in perception that is most easily recognized by those prone to motion sickness.

Two issues then present themselves, which can be taken as potential solution opportunities:

1. Could this type of virtual reality be used therapeutically, to see whether motion sickness can be reduced by training in an environment that already separates these two elements?

2. There is also an opportunity to add eye-tracking hardware and associated software to account for this disparity and create a more effective virtual environment.
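For the second opportunity, the core measurement is simple to state: the angle between where the head points and where the eyes point. Here is a minimal sketch of that calculation. The vectors and the comfort threshold are invented placeholders, not researched values, and a real renderer would of course take gaze data from actual eye-tracking hardware.

    type Vec3 = [number, number, number];

    // Angle between two direction vectors, in radians.
    function angleBetween(a: Vec3, b: Vec3): number {
      const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
      const mag = (v: Vec3) => Math.hypot(v[0], v[1], v[2]);
      return Math.acos(dot / (mag(a) * mag(b)));
    }

    const headForward: Vec3 = [0, 0, -1];     // where the headset points
    const gazeDirection: Vec3 = [0.3, 0, -1]; // where the eyes point (from eye tracking)

    const disparity = angleBetween(headForward, gazeDirection);
    const COMFORT_THRESHOLD = 0.26; // ~15 degrees; a placeholder, not a researched limit

    if (disparity > COMFORT_THRESHOLD) {
      // e.g., re-anchor the "in front" point to the gaze vector rather than the head vector
      console.log("Gaze has diverged from head orientation by", disparity, "radians");
    }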

I will post more on this after further experimentation. There are separate issues with the tactile UI, which I will address in another post. If you have any questions, just ask.

Posted in UI Function, UX Strategy, VR

Culturally Agnostic UX.

Designing a culturally agnostic UX.

By Bob Glaser, UX Designer ©2014
When designing the UX for an MVP (minimum viable product), one of the questions you need on your "what I need to know" list is: what is the initial audience demographic, and what is the longer-term demographic? For reasons of obvious practicality in business planning, these need to be two separate questions and should have two different answers. If the answers aren't different, then you might as well be throwing darts at a board to determine a marketing strategy. I am, of course, oversimplifying somewhat, but not by much.

The reason for these questions is that the fundamental UX structure should be culturally agnostic. There may seem to be an exception when both the user and the task feedback of the UI are highly restricted to a specific and typically advanced content/skill set (e.g., neurosurgeons). The issue with that is that it still leaves out language as an attribute.

I don't want to promote the idea that the UX itself should be culturally agnostic, because that would produce an experience useful to few, if any. Also, if a product is culturally driven and not applicable to anyone outside the target demographic, then cultural agnosticism is far less important, though it shouldn't be dismissed completely, for the sake of innovations that may be repurposed later. (I'll address this later.) Often these types of applications or products are meant to address an issue specific to a demographic that is also geographically specific.

Part of this issue also revolves around the common problem of assumption (which I've addressed previously). Often when we design UX and IxD (not to mention content and platforms), we have biases that assign the attribute of "common knowledge," or worse, "common sense," to our decisions. (I could quote any one of myriad quotes about common sense, but you can just click on the link to see for yourself.) In the fairness of full disclosure, this article, which I write in my native US English, with the link just provided also in English, is not culturally agnostic. Such is the limitation of my writing, but anyone fluent in both English and another language who sees it as useful is welcome to translate it. If I make an assumption based on my perspective of American culture here in Silicon Valley, you are free to ask me to clarify it or to suggest wording that is more encompassing.

Transitioning to a culturally agnostic process.

Do it incrementally.

It is important to realize that it is unrealistic to expect to switch to an agnostic approach all at once, because biases can't be "turned off" cleanly or suddenly outside of a theoretical environment. Humans are affected by emotion no matter how scientific and pragmatic they may be.

The first thing you want to do is add an agnostic filtering step to your UX/IxD development cycle. Initially, this should focus solely on cultural biases presumed in the design and architecture. For example, if you have access to employees who spent a significant portion of their lives in a cultural situation different from yours, have them review the design with the instruction to focus on anything you presumed.

For example, you could be gathering inaccurate data by asking a question that looks like this:
I am a:
□ type A
□ type B
□ Decline to answer.

This creates a surprisingly inaccurate response. The reason is the presumption that A and B encompass everyone and that the third choice will be taken literally. When users define themselves as neither A nor B, and the generic but all-encompassing "other" is not an option, then none of the answers is relevant. They are forced to choose an answer that is inaccurate, and in a way that you can't assess when collecting the data. I know from my own experience with these types of questions that I am usually somewhat angered by not even having the choice of "other." Having to choose "Decline to answer" sounds like I don't want to tell you, when in fact the opposite is true, but there's no option to express that.
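A sketch of the more forgiving structure this example implies: an explicit "other" with free text, kept separate from "decline to answer" so those two very different meanings are never collapsed. The type names and labels are invented for illustration.

    type Answer =
      | { kind: "option"; value: "Type A" | "Type B" }
      | { kind: "other"; description: string }  // self-description, free text
      | { kind: "declined" };                   // a deliberate non-answer, kept distinct

    function recordAnswer(answer: Answer): string {
      switch (answer.kind) {
        case "option":   return answer.value;
        case "other":    return `Other: ${answer.description}`;
        case "declined": return "Declined to answer";
      }
    }

    // A respondent who is neither A nor B no longer has to pick a wrong bucket:
    console.log(recordAnswer({ kind: "other", description: "neither A nor B" }));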

These types of questions can anger users because they may address partnership status, sex, race, religion, nationality, even accessibility descriptions. In the US, the diversity questions an employer is required to ask are a good example of this, though the employer has no control over them, since they are a federal requirement. The arbitrary clumping of groups doesn't bother anyone who perfectly fits the available choices, but the remainder of the population has to choose between the "closest" but inaccurate designation, in some arbitrary way, or a "Decline" with its variable and inaccurate inferences.

Now, outside of government-mandated questions, the UX designer can focus on the areas they control. There may even be the option of defusing the government requirements by distancing the relevant questions from the mandated ones, improving both accuracy and compliance. The options here are numerous and beyond the scope of this article.

You will know the process step is well integrated once you've gone through at least two release cycles and all of the stakeholders can see empirical results influenced by this approach. After you have integrated this step into the process (and expect this to take time) and all the teams are acclimated to it, you can evolve to the next step:

Incorporating cultural agnosticism into the complete process.

Once team members have absorbed the step of checking for cultural agnosticism into the process, it is time to make it a checkpoint line item that appears in every iteration of the development process. The big return here is that other groups can see how this approach applies to areas outside of UX and IxD, such as product management, marketing, QA, even engineering, R&D, and sales.

It is really important that the prior incremental approach is fully accepted, or else you're likely to hit a wall with this. Full adoption of the incremental model will create evangelists for the broader implementation, making adoption easier and more natural, without UX/IxD being a lone grandstander looking for validation and attention.

Try to include as culturally diverse a group as possible, and if you can't get that many in person, you can always go online and ask. In cases like this, always go to the source rather than to what you may consider close enough.

For example, a Chinese perspective should never be generalized as the Asian perspective. It is the Chinese perspective. Even within it there may be differences, such as a Mandarin vs. a Cantonese perspective.

To make the point using the different taxonomic perspectives of gender: there is more than the sex of the subject. There are gender, gender identity, and sexual preference. If any of these are relevant to your UX design, then you need granular differentiation, since they are not interchangeable in any way. Clumping them together is likely to give you inaccurate feedback at best and, at worst, will anger the subject.

80/20 rule in cultural agnosticism.

As Don Norman says, "At some point you have to stop and release the product."[1] I realize that the previous section on complete integration can set up a level of granularity that creates an endless cycle of iteration and scope creep, with negative effects on schedule and budget. Here is where you manage implementation by narrowing focus to the "minimum" aspect of the MVP. The important thing to remember is not to paint yourself into a cultural corner that forces you to reinvent the UX design with each new version, particularly when that new version is meant to expand the market of the product.

In the end, when you design the UX with a culturally agnostic approach, you will have a foundational design that becomes portable across cultures through easier and more effective localization.

[1] UX Hacking: An Evening with Don Norman, 17 Dec 2013, Stanford GSB.

Posted in Globalization Localization, MVP, UX Design, UX Strategy

Show me your UX work.

Show me your UX work – The portfolio and skill assessment in UX.

By Bob Glaser ©2014

The UX Process

From whiteboard to delivery.

I am occasionally annoyed by the often misleading concept of the UX portfolio. The very concept is often paradoxical. Showing completed work doesn't address the myriad problems, directional changes, and complex dependencies that drove the solutions used to achieve the result. When those problems and their solutions are presented, the result is often overwhelming, and worse, no one has the time or desire to read at length about the problems. The paradox is strengthened further by the request to demonstrate simple solutions to complex problems. By its very nature and designation, a complex problem is complex. You cannot understand the rationale and elegance of the simple solution if you don't understand the complexity of the problem, and not merely that the problem was complex, but how, and in what ways.

How often have you been in development meetings or brainstorming sessions and had each solution knocked down by one countering practical reality after another, not to mention personal agendas? These practical realities may not be known until you suggest solutions. This winnowing process is often time consuming but important, because it progressively leads you toward the better solution given current restrictions. I avoid saying "best solution" because I've not worked on many problems with unlimited timeframes.

You can pick one specific example to explain all that, but often it won't be the problem-to-solution process and result that the viewer of the portfolio is looking for. They want a specific match rather than an example of an approach. If the approach is generalized, you run the risk of not being specific enough. I've had reviewers look at a project and say, "Well, that's not a particularly complex UI." To which I respond, "The UX/IxD/UI specification document is over 700 pages long." So we've quickly gone from "too simple" to "too much information."

If samples of work appear unremarkable, that may be the point of good UX. Most UIs (though not all) are not dazzling spectacles of design, nor should they be. They should make the task simple, easy, and relatively intuitive for the user. When someone has just purchased a new big-screen TV, chances are high that viewing content is their goal. It is not about having a great, memorable experience turning it on or off or changing channels or inputs. That kind of emphasis may sell a TV, but it's going to be very annoying after it's sold. This is why demos and demo modes on appliances and software should be handled as items separate from UX. It is not that they should not work together, but rather that they have different goals, which are likely to be contrary in many respects.

The process of UX design is the same whether you're designing a child's toy or an application for enterprise-wide data analytics. The more knowledge I have in the proposed genre (whether children's toys or enterprise software), the more likely I am to have biases. Overcoming those biases can be as time consuming, if not more so, than learning the domain from scratch through observation of users. The skill set of the UX designer lies in knowing how to gather this data, ask questions, develop a strategy, design a process based on best UX practices, and stay open to alternative approaches that were previously discounted by rote for reasons that are no longer valid or understood.

Over-reliance on visual imagery vs. function in UX

Because of this lack of time to read about the problems, what gets presented are typically only the visual aspects of UX and the visual solutions used to solve them. The other aspects, the ones not easily described visually, are the nonvisual parts of the user experience and the intangible problems, processes, and solutions used to address them. If you start to explain issues of cognitive load, cultural acceptance, mixed but specific demographics, user data, affordances, and so on, they are difficult to convey simply in a portfolio. The very existence of UX designers is often about "simplifying the complex," but to understand this, you must explain what the complexities were that you were simplifying. Typically the UX designer simplifies the presentation of complexity without actually eliminating it; this is necessary in any complex application. There is a tremendous amount of thought, research, and trial and error that goes into wireframing that has nothing to do with the look and feel of the end product. At the wireframing stage, these problems need to be solved before moving on. This is not to say that visual design is unimportant, or that there is no visual design consideration during the wireframing process, but rather that wireframing includes many more attributes than just the visual design structure. Wireframing is meant to identify structure, interaction, and flow prior to the creation of a visual language, in order to prevent a lot of unnecessary design and asset rework.

As someone who started out as a visual designer working in medical publishing, I often saw an attitude among the writers and graphic designers where each thought their own output was the more critical part of an article. Collaboration happened by force and necessity, and each thought the other's field was something anyone could do. I thought, along with a few of the writers, that we were interdependent, but this was not the common view. I know that professional graphic designers (as we were called then) were often viewed as having some mysterious internal gift for creating art, never mind the extensive formal training, the technical knowledge, and the understanding that what we created wasn't something we liked because of some inherent artistic gift, but rather a carefully crafted design appropriate to the purpose and audience for which it was intended, whether we liked it or not. I also know, from personal experience, that I've created some truly hideous designs that were extremely successful, because they were properly designed in exactly that manner. Whatever inherent aesthetic sense I have told me some of these designs were hideous; my training, experience, and knowledge of the audience told me the best design for these specific products was not one aimed at me.

This desire to dazzle and distract the user rather than do what the user actually wants (which, all too often, seems boring to the internal team) often stems from internal political pressures. The user will say, "I find this difficult to use or understand," to which the response is some form of "Yes, but isn't it pretty?" or worse, an incorrect reinterpretation (whether intentional or not) of the user's reaction as boredom rather than annoyance or confusion. This is reinforced even more by the "I know what I want, and I know everyone else wants it too" type of bias.

Flash vs. substance

There are three aspects of this: first, the "kitchen sink" feature approach; second, the new feature approach, particularly when the new feature is considered more important than the needed feature; and third, the "fix-it" or triage approach. Surprisingly often, the emphasis on flash blinds the designer, and often everyone on the production side of a product, from executives to marketing, product development, engineering, quality assurance, and implementation/deployment (e.g., sales, integrators, VARs, etc.).

1 The kitchen sink approach

This is often more a marketing requirement than a user need: a way of selling a product on an expansive set of features meant to accommodate anyone. Unfortunately, there is no one "anyone." The most common individual user will probably rely on a few features at best. Depending on their environment, habits, and expectations, those few features can vary significantly from one user (and associated use cases) to the next. In each of these use cases, most of the other features are either background noise or add unnecessary cognitive load, annoyance, or confusion. Here the UX designer has to simplify the UI to accommodate the different use cases; the methods vary with the size of the user base, resources (time, money, etc.), and the actual capabilities available to implement the desired outcome. Here is where there can be a huge difference between first impression and second impression. At an enterprise level, I would call these the primary impression (the decision maker/purchaser of the product) and the secondary (and onward) impressions (the users, who may have little choice in the purchasing decision). The UX designer must manage these secondary impressions, since the focus is on using and reusing the product more than on selling it. This also means the UX designer should make sure the business goals of marketing and UX are aligned.

2 The new feature approach

This is also focused more on selling than on using the product. In this case, the new feature may have been added based on user requests, newly available technology, or some other innovation. In any case, adoption of the feature may or may not go as expected. If the new feature is based on anything other than user request, expect slow adoption, even if the feature works exceptionally well. New features are competitive market objects, often used as differentiators or to follow new trends in predicted (but not tested) user expectations.

3 UX as a ‘fix-it’ step in the process

When UX is introduced late in the development cycle, the work done by UX designers is more triage than design. It's too late and far too costly to redesign properly, so while the intent is good UX design, the end result is making a bad design "less bad." This also tends to put the UX designer in a defensive mode rather than a productive one, because their failure to completely fix a problem (a problem caused by bringing UX in too late) reinforces the bias that UX is an ineffective set of skills, knowledge, and expertise. Sometimes we UX designers are lucky and are presented with problems that are easily solved, as when a myopic product vision has missed obvious, simple usability problems with easy fixes. It is the responsibility of the UX designer to state at cycle wrap-up that what was addressed was the "low-hanging fruit" and that substantial changes may still be needed in the next round. What happens after that depends on internal politics. Dancing around individual and group egos is just something you have to learn in business; that is a different subject, beyond the scope of this article.

Internal resistance to change

Interestingly, in small but growing companies, every time a new department is added to the production mix (research, design, new technology, etc., often initially a single person, whom I'll assume for this argument is knowledgeable and experienced in their field), there will come a moment when that new department says, "The emperor has no clothes." A well-run company will say, "Then tell us what we need to do." Often, though, particularly in progressively larger companies, the new department is dismissed as not understanding what has been done so far. Even though the new department's expertise didn't exist there previously, there is still resistance to the change. Often, the skills of the new department are diminished by considering them obvious, a commodity, or something that requires no formal training or specialized skills and tools. This is normal human behavior, but bad for business. Nevertheless, the new department will often have to earn respect rather than be given it initially. This is not very productive, and the department can end up failing in the process. Many seasoned UX designers know that more time and effort is spent defending UX improvements than presenting them.

Originality

"How much of what I'm looking at did you do?" This question varies in meaning as much from one person asking it as it does from one person answering it.

Consider the following:

If addressing a painter: did you create this image entirely from your imagination (which is still derivative)? Did you gesso the canvas? Did you stretch the canvas yourself? Did you weave the linen? Did you grow the flax? Did you make the canvas frame? Did you cut the wood for the frame? Did you grow the tree the wood came from? And so on, through the pigments and brushes and every other tool used in the painting's creation.

The example above shows that many people may be involved in the creation and production of a painting. While it may seem ridiculous, it strikes close to home for a UX designer. Reviewing my own portfolio, I can point out myriad attributes that someone else, or some other group, was responsible for, even though I may have had intense input and been responsible for designing many of the assets or other, more trivial aspects of the UX design and production. These distinctions are irrelevant to me.

I'm more interested in the ability to use tools and resources than in making sure everything is original. Not everything needs to be a new invention. The great majority of the time, good UX design comes far more from simplifying the UX than from coming up with an "innovative new concept." Innovation is typically (though not always) more cost effective than invention, since you don't waste resources "reinventing the wheel." To that end, if you are replacing the wheel with something new, is it truly better, or just "new"?

Consider the iPod. Here was a product with virtually no invention (in terms of UI/UX) that nevertheless innovated the world of UX by creating an aspirational experience that superseded all prior mp3 players (and there were hundreds on the market when the iPod came out). There was little invention in the product, but rather very clever marketing of music via the "experience of the iPod." A great majority of consumers still perceive Apple as the first to release a personal digital music player.

UX Design Knowledge vs. Domain Knowledge

There is a commonly assumed requirement that the UX designer have domain expertise in the product/marketing space, a concept that seems like a no-brainer to the product developer. If a company produces software for data analytics, or an electronic music instrument for children, or jewelry and accessories for the fashion-conscious woman on a budget, there is this misguided idea that domain knowledge and experience are essential for the UX designer. This myopic view causes problems. Prior knowledge creates biases. Any good UX designer follows a process that includes determining the product features and functionality and researching an unbiased definition of the target user (interviews, persona creation, etc.), among a number of other steps. Prior experience is more likely to prejudice the UX designer toward premature conclusions. Learning the domain from scratch will often reveal more opportunities and expose areas of need in the existing products (if they exist). This allows for more innovative UX born of a fresh outlook, rather than innovation for the sake of innovation (which often has a poor ROI). The more objective I am as a UX designer, the better my results will be. This objectivity also keeps me from becoming emotionally attached to particular concepts and defending them without a solid foundation of logic, data, and testing.

Legal issues of IP

I'm curious how many companies want to see samples of your wireframe work, specification documents, strategic UX planning, and so on, without considering that sharing them would breach NDAs or constitute an inappropriate (even if unintentional) request to see the inner workings of another company's products and processes. I have to be very cautious about this sort of thing, and it seems oddly hypocritical for a company to disallow your showing the work you did for them while wanting to see the detailed work you did for others. Additionally, by the time you can publicly display the material, it's often old enough that its age draws more focus than the solutions it contains.

Concluding thoughts

I don't think requests for a UX portfolio are going to stop, but I believe that until we treat UX design as far more than a visual portfolio, the practice will continue to be deemed acceptable. Sadly, this means good visual design may push a good visual designer into a UX role they are unequipped for, while the right UX professional sits waiting. It also means the visual designer, who may be brilliant at design, can be forced to work in areas they don't want to work in. The same goes for the coder.

"UX" is often tacked onto job titles without any real understanding of what UX designers do, or with the belief that UX design is a minor adjunct skill. The result is job descriptions and actual responsibilities that relegate less than 20% of the work to UX design, while the majority of the work is coding, visual design, or even marketing.

So if a job description places requirements of "expert knowledge" in areas other than UX while the UX requirements are looser, such as "must be familiar with UX/UI best practices," then the areas where expertise is required are what the job is really about. The issue is not whether I have expertise in those areas, but that the job is less about UX and more about wherever "expert knowledge" is the must-have requirement.

Posted in UX Design

The necessity and risks of assumptions in UX.

By Bob Glaser ©2014

We make hundreds of assumptions every day. We have to make many of them simply to be productive, and a very high percentage of them are accurate enough to meet the needs of the situation, often regardless of the situation's importance.

The longer we live (given environmental and circumstantial uncertainties), the more likely we are to believe, with increasing certainty, that the assumptions we make are true. This is accurate up to a point: the point at which cognitive biases, like confirmation bias, stop us from correcting errors.

Another cause of rationalized support for assumptions is the need for situation-specific survival skills: uniquely personal emotional tools that may help an individual survive a high-risk situation but end up being destructive once the risk is removed. Examples include children growing up in extremely abusive situations, a soldier returning from combat, or an individual developing a drug dependency to address a condition that no longer exists, such as pain or environmentally triggered anxiety. I won't be addressing these survival-skill-based assumptions, because they vary too much in both cause and effect from one individual to another.

Observing the User

As UX designers, we need to curb the use of assumption when silently observing the user. If we are observing well, we are recording the facts of an interaction and a set of observable behaviors. Any supposition based on assumption should be made after all the facts are recorded, not during, since mid-stream assumptions are highly likely to color the ongoing documentation of facts: we begin to ignore observations that don't fit the assumption and place undue weight on those that do. While this may seem obvious to most UX designers, it is often not followed methodically.

  1. Get the facts first.
  2. Ask your predetermined survey questions next, separating:
    • Qualitative data
    • Quantitative data
  3. Ask your soft or open questions last.
  4. After all the participants are done, start making and testing assumptions (one way to keep the session data partitioned is sketched after this list).
  5. Rework and iterate for the next round.
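As a sketch of how that partitioning might look in practice, here is one possible shape for a session record. The field names are invented for illustration; the point is only that facts, structured answers, and open responses live in separate buckets from the moment of capture.

    interface SessionRecord {
      facts: string[];                      // step 1: observed behavior only, no interpretation
      quantitative: Record<string, number>; // step 2: predetermined questions, counts and scales
      qualitative: string[];                // step 2: predetermined questions, verbatim answers
      openResponses: string[];              // step 3: soft or open questions
    }

    // Step 4: assumptions are drafted only once every participant's record is complete.
    function readyForAssumptions(sessions: SessionRecord[], expectedParticipants: number): boolean {
      return sessions.length >= expectedParticipants;
    }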

This is a great example of how extra work done early can save a lot of time and money later. It also gives you the opportunity to determine whether a high success rate is due to the design working as intended (claiming so without checking is a very common ego-based assumption) or whether a high failure rate is caused by an assumed deficit. High-speed, high-iteration groups tend to make this methodical practice difficult to follow. What must be explained, in terms of the project and ROI, is the importance of rigorous adherence to this clarity of distinction about what assumptions were made.

Consider the following hypothetical, in which two successive pages of a three-page web form are presented to the user. The two dialogs shown were at the bottom of pages two and three, the latter being the last page of the form.

Bottom of page 2, below:

[assumption1 image]

Bottom of page 3, below:

[assumption2 image]

Given the lack of clarity that this causes, there are several possibilities of failure here.

  1. The repositioning of the buttons, even though they are logically in the same order.
    • Left to right, followed by top to bottom: 1 enter text, 2 continue, 3 cancel.
    • The position of "Continue" on the first page is where "Cancel" is on the second page (maintaining a common location but with contrary actions).
  2. The change in wording.
    • The "continue" button's label is two words on the first page but one word on the second, while the opposite is true of the "cancel" button.
    • No words in the "continue" button match from page to page, since "Submit" is used on the other page.
    • It may not be clear to some users that "Submit" completes the entire form and might be a dual-action button that both saves responses and submits the form as completed. Some users may assume there is no going back after submission, while others may assume there will be a confirmation or a chance to edit responses afterwards. (One way to prevent this drift is sketched after this list.)
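The sketch below shows one way to prevent exactly this drift: define the form's actions once and render every page from that single source, so position and ordering cannot diverge, and any final-page label change is explicit and deliberate. The names are hypothetical, not from any real framework.

    type FormAction = { id: "continue" | "cancel"; label: string };

    // Single source of truth: same order and base labels on every page.
    const FORM_ACTIONS: FormAction[] = [
      { id: "continue", label: "Continue" },
      { id: "cancel", label: "Cancel" },
    ];

    // The only permitted variation is the final page's explicit, unambiguous label.
    function renderActions(isLastPage: boolean): string[] {
      return FORM_ACTIONS.map(a =>
        a.id === "continue" && isLastPage ? "Submit completed form" : a.label
      );
    }

    console.log(renderActions(false)); // ["Continue", "Cancel"]
    console.log(renderActions(true));  // ["Submit completed form", "Cancel"]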

The mistake commonly made by the UX designer is considering one assumption instead of both (along with the possibility of others). Although this is based on a real page currently in use by a major company, it is a somewhat severe example, though sadly not as uncommon as we'd like. There is a real need to determine "What is the most common reason for failure?" based on factual observation. A reasonable follow-through is then to make sure the target demographic has expectations the redesign can meet.

This leads to the sweeping generalization: the type of claim based on large-group averaging, which may need to be presented as such. It is acceptable to make reasonable sweeping generalizations to make a point, but not when specific demographics are being addressed. In those cases, the sweeping generalization can be flawed if the specific demographics total far less than a significant sample of all people, if the generalization covers a range of objects of which the specific product being designed is only a small portion (e.g., business app vs. consumer app), or if it spans a different paradigm (e.g., virtual vs. real object). There are many examples of a product intentionally marketed and designed for an audience that runs contrary to the general population, despite the general population's view of the target group. A good example is the high-end audiophile who wants tube amplifiers and vinyl turntables for their "closer to real" analog output, not because they're "retro," as the general population might perceive it.

The assumption of intended use

There is the example of the unintended function that is discovered, or whose use is reimagined, by the user and may surpass (exponentially) the intended function of a product. This may come from need and basic adaptive use, like folding a napkin into a shim to balance a rocking table on an uneven surface. In this example, the problem and solution are obvious, and the variable is determining what is available in close proximity that meets the need. Usually paper goods are chosen, since they can be folded to match the correct thickness (an additional sub-variable) to solve the problem.

Take Facebook, for example. Its original intent was a narrowly focused user group of college students; in fact, it was narrower than that (first Harvard only, then other Boston colleges, the Ivy League, and Stanford). It was restrictive by intent. Mark Zuckerberg and others recognized the monetization potential of making the service available to all. This was not accidental, but rather smart observance of serendipity. There are countless similar examples, from children's toys to pharmaceuticals.

The failures come from the apparent security of maintaining the status quo and not wanting to annoy existing customers: staying with "this is what we are known for, so we can't deviate from that path." Sometimes you must deviate, and sometimes the adaptation is dropping a rigid business model for a more flexible one, changing the target demographic, or adding, deleting, or reprioritizing functions.

How to address the facts and assumptions

Be cynical. Don't move forward presuming that the facts and assumptions are correct.

  1. Remember: if you think you've made no assumptions, then you're already set up for problems, unless you're just plain lucky. Everyone makes assumptions. Due diligence is required here.
  2. Clearly document and label assumptions and facts as different entities. This doesn't have to be done in any complex way; sometimes something as simple as text formatting can keep them sorted (a sketch of one structure follows this list).
  3. If errors show up later, go to your assumptions list first, as this will save a lot of time. If the problem can't be found there, then consider that there may be an error within the facts.
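Here is a minimal sketch of item 2, with invented field names: facts and assumptions carry an explicit label from the moment they are recorded, which makes item 3's "check assumptions first" a one-line filter.

    type NoteKind = "fact" | "assumption";

    interface StudyNote {
      kind: NoteKind; // the label travels with the note, not with the reader's memory
      text: string;
    }

    const notes: StudyNote[] = [
      { kind: "fact", text: "7 of 9 participants missed the cancel button." },
      { kind: "assumption", text: "The cancel button's contrast is too low." },
    ];

    // Item 3: when an error surfaces later, review the assumptions first.
    const reviewFirst = notes.filter(n => n.kind === "assumption");
    console.log(reviewFirst.map(n => n.text));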

A personal example of effectively overriding common assumptions:

I managed to do something similar to these approaches while working in a small, self-defined team. The numbers we were dealing with weren't as massive as Facebook's (few are), but they were exponentially greater than the original intent of the project we were working on. We fought diligently to solve a problem that no one else perceived as a problem, because of a common group assumption. We were a team of three: myself as the UX/UI visual designer, a human factors engineer to deal with the ergonomics of the machine, and an industrial designer.

The problem was identified when mechanical engineering and R&D came up with some significant, patented innovations and features. There was a proposal to make the device suitable for a larger market segment by making it a scalable tabletop machine rather than a built-in machine. As with all previous machines, it would use a simple black-and-white LED UI screen that gave feedback on settings, allowed process counting (items completed by the machine), and would require a trained, dedicated user.

We proposed a color screen. In order to justify its significantly increased cost, we had several weeks to come up with a rationale for its implementation. ROI was the prime focus for a machine that would probably start at $20K for the simplest model.

We brainstormed, and ended up designing a prototype interface that would allow an untrained user to operate the machine. The primary assumption we were discarding was that these machines always have a dedicated, well-trained user. We had no budget beyond the time we spent on development, so we had to discard common assumptions and make many new ones based on heuristics, experience, and simple user interviews: bare-bones user-centered design. We gathered data from software engineering and mechanical/electrical engineering to validate capabilities, and information from sales and marketing via the marketing requirements document. We created a sample use case as our demo of user flow, one that would address the highest number of critical problems. We also considered a number of other use cases so we didn't paint ourselves into a corner. This way, our sample use case, in the form of a predefined workflow, covered all of the top issues and would be factually defensible in presentation.

With the collaboration of my two colleagues, we designed the basic, simple UI of a very complex system. I created the layout, graphic elements, and pseudo-prototype, with the help of the human factors engineer on the ergonomic considerations. The industrial designer designed the first rough of a product typically used on a factory floor to look like it could sit elegantly in an office.

We presented our case to executives and major stakeholders. It passed and was implemented. Its success was demonstrated the day it was released for sale (a few months after a completed prototype was demonstrated at a national convention): manufacturing had already built a projected six months' supply of the product, and it sold out in three days. This was a product that ended up selling for between $25K and over $120K depending on the configuration. The majority of sales were made on the fact that a dedicated, trained operator would not be required.

This is not to say our solution was flawless, but it was orders of magnitude better than prior solutions on similar, as well as more expensive, machines.

Posted in UX Design, UX Strategy