VISUAL GUIDE
Cognitive bias in IT
10 traps for IT managers to avoid
IT leaders today face a unique challenge. Not only do they need to buy and build the right software and workflows, they also have to understand how that tech will drive value for users and advance key business goals.
Left unchecked, cognitive biases can lead to misalignment between the software experiences you build, what users want, and what the business needs. Do you know how cognitive bias may be affecting how you work or make IT decisions?

Sunk-cost fallacy

Following through on a project, workflow, or feature that may not be right—or may even be doomed to fail—simply because one has already put so much work into it.

Bias in the wild

A company has been building an internal customer relationship management (CRM) system for over two years. Despite repeated delays, cost overruns, and mounting evidence that commercial alternatives are the better choice, it decides to double down on the project because so much time and money has already been spent.


Halo effect

Building a positive holistic picture of a path, product, or person based on one or a few traits that don’t justify doing so.

Bias in the wild

During the evaluation of a new business tool, IT managers overlook critical shortcomings—like poor integration with internal systems and lack of enterprise-grade permissions—because they’re overly impressed by the vendor's brand and reputation.


Authority bias

Privileging the opinions or judgements of someone in a position of authority and giving them unmerited weight.

Bias in the wild

An IT team is deciding how to best set up a new internal tool. A junior engineer suggests a new approach that would save time and effort. But the CIO casually mentions a more traditional setup they’ve used before. Even though it’s more complex and slower to implement, the team goes with the CIO’s suggestion.


Bandwagon effect

Prioritizing a point of view or direction based on the number of people voicing support for it.

Bias in the wild

An IT team at a small business is looking for a communication tool. Employee feedback says that “everyone uses” a given app, so they pick it, even though it means they’re paying for features they don’t use and a simpler (or free) alternative would’ve worked just as well.


Confirmation bias

Seeking out sources, points of view, and supporting materials that confirm one’s prior view or stance on an issue.

Bias in the wild

After a company updates its internal HRIS software, some employees start saying, “This update made everything slower.” An IT manager who already disliked the vendor agrees and starts collecting complaints. Even when other evidence points to the slowdown being due to a surge in network usage from another system, the manager insists the update is the root cause and pushes to roll it back.


Hindsight bias

The tendency to believe that the outcome of a decision was more predictable than it actually was at the time it was made.

Bias in the wild

After a team launches a new calendar integration feature, users start reporting issues with event syncing. During the postmortem, an IT manager criticizes the team for not anticipating that the integration’s complexity would cause issues. But before the release, no one flagged major concerns, and testing didn’t reveal any critical issues.


Availability heuristic

Giving too much weight to information that’s top of mind or easily accessible when making IT decisions.

Bias in the wild

An engineering team is considering a cloud storage service for a new product. One engineer immediately argues against the tool, recalling that it caused major problems on a past project. The team quickly agrees, even though the problem last time was due to misconfiguration, not the tool itself.


Ostrich effect

Choosing to ignore information that threatens one’s preferred way of doing things.

Bias in the wild

An IT team knows their monitoring system is showing a slow but steady increase in server errors from their tool’s API. But the errors aren’t causing obvious user complaints—yet. Under pressure to meet a release deadline, the team decides to avoid investigating until after launch. Eventually, the issue escalates and causes a major outage that takes longer to fix because it went unchecked.


Clustering illusion

Spotting a pattern in the data or signal in the noise where there is none.

Bias in the wild

A support engineer notices that three user tickets in the same week came from customers in the same region reporting login problems. They raise an alarm: “There must be a regional issue!” IT begins investigating geo-specific network and localization bugs—only to find out that similar issues were also reported that week by users from other countries. It was just a coincidence.
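How likely is a “cluster” like that to show up by pure chance? A minimal Python sketch below simulates it; the numbers (20 regions, 15 login tickets per week) are illustrative assumptions, not real support data:

```python
# Hypothetical sketch: how often does a "regional cluster" of 3+ login
# tickets appear purely by chance? All counts are made-up assumptions.
import random
from collections import Counter

REGIONS = 20           # assumed number of customer regions
TICKETS_PER_WEEK = 15  # assumed login-related tickets in a typical week
TRIALS = 100_000

cluster_weeks = 0
for _ in range(TRIALS):
    # Assign each ticket to a region uniformly at random
    counts = Counter(random.randrange(REGIONS) for _ in range(TICKETS_PER_WEEK))
    if max(counts.values()) >= 3:
        cluster_weeks += 1

print(f"Weeks with a 3+ ticket 'cluster' in one region: {cluster_weeks / TRIALS:.1%}")
```

Even with tickets spread uniformly at random, a noticeable share of weeks produce a cluster somewhere, which is why a handful of same-region tickets is weak evidence on its own.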


Dunning-Kruger effect

The tendency for individuals with limited knowledge or skills to overestimate their abilities in a given area.

Bias in the wild

A project manager is leading a migration from an on-prem server to the cloud. After attending a few high-level overview sessions, they create a detailed timeline and confidently tell stakeholders it can be done in two weeks. But they don’t consult the sysadmins or cloud engineers about hidden complexities, service dependencies, and rollback plans. The migration ends up taking over a month, causes unexpected downtime, and frustrates the team.

4 ways to break free from cognitive bias in IT
1. Acknowledge your bias
We are all biased as humans, and that bias can trickle down into the AI systems we create. The important thing is to recognize that our biases exist and to build ways of addressing them into your IT strategy.

Where to start? Build a culture of feedback, iteration, and experimentation within and across your IT teams. Welcome all points of view. Let data guide your decision-making.
2. Seek out the facts
It’s unrealistic to expect the right data to be your guiding star on every IT decision. But do your best to spot biases and rely on data-driven insights where possible.

Leverage application, workflow, and AI agent analytics to understand the user journey. Examine your users by segment: are you seeing patterns in key metrics like support ticket resolution time? Segment-level analysis can reveal whether the needs or concerns of different groups are going unaddressed and whether user priorities align with your own.
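As one minimal sketch of that segment-level view, the snippet below uses pandas on a hypothetical export of support tickets; the column names and values are assumptions for illustration, not real analytics data:

```python
# Minimal sketch: median support ticket resolution time by segment.
# The column names and values are hypothetical, not real analytics data.
import pandas as pd

tickets = pd.DataFrame({
    "department": ["Sales", "Sales", "Finance", "Finance", "Support", "Support"],
    "resolution_hours": [4.0, 6.5, 30.0, 42.0, 5.5, 7.0],
})

# Segments with unusually long resolution times may have unaddressed needs
by_segment = (
    tickets.groupby("department")["resolution_hours"]
    .median()
    .sort_values(ascending=False)
)
print(by_segment)
```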
3. Explore other perspectives
Feedback is a gift, no matter what it’s telling you. Don’t just focus on employees who are already happy. Get fresh points of view from multiple areas, including frustrated departments or employees.

Build a complete picture by soliciting regular feedback: poll and survey your users where and when it matters most (when they’re in the tools and workflows themselves) to determine what’s working and what’s not. Proactively target user segments whose voices may be underrepresented in IT planning meetings.
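One way to find those underrepresented voices is to compare each segment’s feedback response rate to the overall rate and flag the laggards for your next in-app survey. The sketch below is a rough illustration; the departments, counts, and threshold are all assumptions:

```python
# Hypothetical sketch: flag segments whose feedback is underrepresented
# relative to headcount, so they can be targeted with in-app surveys.
# All department names and counts are illustrative assumptions.
headcount = {"Sales": 120, "Finance": 40, "Engineering": 200, "HR": 25}
responses = {"Sales": 45, "Finance": 2, "Engineering": 90, "HR": 1}

overall_rate = sum(responses.values()) / sum(headcount.values())

for dept, size in headcount.items():
    rate = responses.get(dept, 0) / size
    # Arbitrary threshold: under half the average response rate counts as underrepresented
    if rate < 0.5 * overall_rate:
        print(f"Target {dept} in the next in-app survey ({rate:.0%} response rate)")
```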
4. Iterate, iterate, iterate
Being data-driven is necessary for an IT team to succeed, but it’s not sufficient. Be curious. Always be testing new approaches, but be clear about what you’re trying to learn and what “success” looks like, including both positive and negative outcomes. Be specific about your KPIs and get relevant business stakeholders aligned. Challenge your own ideas as much as possible, and if there’s data that supports your point of view, verify that it’s relevant and statistically significant.
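As a rough illustration of that last point, a simple chi-square test can indicate whether an observed difference in a KPI between two groups is likely real or just noise. The adoption counts below are made-up assumptions, not results from an actual rollout:

```python
# Minimal sketch: check whether a KPI difference is statistically significant
# before acting on it. Adoption counts are made-up assumptions.
from scipy.stats import chi2_contingency

# Rows: new workflow vs. old workflow; columns: adopted vs. did not adopt
observed = [
    [48, 152],  # new workflow: 48 of 200 users adopted the feature
    [30, 170],  # old workflow: 30 of 200 users adopted the feature
]

chi2, p_value, dof, expected = chi2_contingency(observed)
# Only treat the difference as real if p falls below your pre-agreed threshold
print(f"p-value: {p_value:.3f}")
```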
Cognitive biases can be tricky to combat. But for IT teams, there’s a clear path forward.
Discover Pendo for IT

Want to see how Pendo can help your sales team?

“Guides let our team spend less time searching for information and more time adding value to every interaction with every customer.”

– Amy Beal, Director of Sales Learning at Ferguson

Get a demo

Frequently asked questions