u/Brighter_rocks

Adding more DAX to fix problems is usually the wrong move

It starts fine:

Store Count = DISTINCTCOUNT(Facts[StoreID])

Then a fix. Then a fix for the fix. Six months later there's a comment block that starts with "DO NOT CHANGE THIS" and nobody knows why.
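A typical end state of that drift, sketched with hypothetical column names (IsActive, RegionName, and the fiscal-flag filter are all invented for illustration):

```
-- six months later: one patch per visual, none of them fixing the model
Store Count =
IF (
    ISINSCOPE ( Region[RegionName] ),
    CALCULATE (
        DISTINCTCOUNT ( Facts[StoreID] ),
        Facts[IsActive] = 1,                    -- dedup that belongs in ETL
        REMOVEFILTERS ( 'Calendar'[FiscalFlag] ) -- override nobody remembers
    ),
    DISTINCTCOUNT ( Facts[StoreID] )             -- the original measure
)
```

Every branch was added to make one specific visual look right, which is exactly the smell described below.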

Totals misbehaving is almost always a relationship problem - many-to-many with no bridge table. Add the bridge and the ISINSCOPE hack goes away. Measures filtering things they shouldn't need to filter is an ETL problem - if the fact table was clean, the measure wouldn't need to know about active records or deduplication at all.
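One common shape of the many-to-many workaround, with hypothetical table names (Promo standing in for the second fact table):

```
-- before: forcing the many-to-many relationship by hand inside the measure
Store Count (M2M) =
CALCULATE (
    DISTINCTCOUNT ( Facts[StoreID] ),
    TREATAS ( VALUES ( Promo[StoreID] ), Facts[StoreID] )
)

-- after adding a bridge table with proper relationships:
Store Count = DISTINCTCOUNT ( Facts[StoreID] )
```

The bridge table makes the filter propagation part of the model, so the measure no longer has to describe it.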

If a measure needs to figure out which visual it's in before it can calculate anything, that's the model not doing its job.

Other stuff that belongs upstream

Mixed-grain fact table - split it at ETL, not with IF(ISINSCOPE(...))

No date dimension - add one. Stop calculating date logic inside measures

Attribute in the wrong table - move it in Power Query, don't LOOKUPVALUE it at query time on every row

Aggregation that belongs at the source - pre-aggregate at the warehouse, not in a measure that recalculates on every render

Multiple versions of the same measure for different visuals - the model isn't providing consistent structure
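The mixed-grain case in particular tends to look like this (MonthlyAmount and DailyAmount are made-up columns for illustration):

```
-- anti-pattern: one measure guessing the grain at query time
Amount =
IF (
    ISINSCOPE ( 'Date'[Month] ),
    SUM ( Facts[MonthlyAmount] ),
    SUM ( Facts[DailyAmount] )
)
```

Split the table into daily and monthly facts at ETL and each measure collapses to a plain SUM over one grain.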

How to spot it

  • New requirements keep modifying existing measures instead of adding new ones
  • A simple question needs 30 lines to answer correctly
  • You can't explain what a measure does without also explaining which visual it was built for

Every workaround makes the next one harder. The person who wrote those comment blocks is gone, and now nobody knows why the overrides exist. Documentation doesn't fix that.

10 simple measures are easier to maintain for 3 years than 4 complex ones.

reddit.com
u/Brighter_rocks — 13 hours ago

The employment gap between college grads and everyone else is now the smallest it's been in 30 years. Your degree stopped being a moat

For decades, a bachelor's degree was the most reliable signal an employer could get. It didn't necessarily mean you were smart or capable - it meant you were capable enough to finish something hard, follow through, and operate in structured environments. That signal carried weight.

It's not carrying the same weight anymore.

The 2026 Human Edge report buries this finding without much fanfare, but it's worth sitting with: the employment advantage of a degree has eroded to a 30-year low. A combination of AI tools, skills-based hiring, and project-based work has made credentials less legible as a proxy for actual ability. Companies increasingly care what you can do in the next sprint, not what institution you attended six years ago.

For data analysts, this is a strange moment. The field spent years building academic legitimacy - statistics degrees, comp sci minors, master's programs in data science. And now the credential ladder is wobbling for everyone. If a degree no longer differentiates you, something else has to. And unlike fields where prestige still travels through institutions, analytics is one of the few areas where your actual output is usually visible and verifiable.

Which means analysts are weirdly well-positioned to adapt - if they treat their work as a portfolio rather than a job history.

A few things that seem to actually be filling the credential gap right now:

Work that lives somewhere public. An analysis posted somewhere, a write-up of a problem you solved, a dashboard someone can actually click through. Anything that lets a hiring manager evaluate your thinking directly instead of inferring it from a degree. In analytics specifically, the gap between "I know how to do this" and "here's proof I've done it" is smaller than in almost any other field.

Domain expertise, not just technical skills. Not "I know SQL and Python" - everyone knows SQL and Python. But "I've spent three years modeling retention in B2B SaaS" or "I know how e-commerce attribution actually breaks in practice" - that's specific enough to be useful and hard to fake.

Visibility inside a community. The analysts who get referred for interesting work aren't always the most credentialed. They're the ones who show up consistently in the places where the work gets discussed - a Slack group, a newsletter, a subreddit, a conference they've spoken at once.

The shift isn't that education stopped mattering. It's that the credential alone stopped being enough to carry you. The good news for analysts is that this field has always rewarded people who could show their work. Now the rest of the market is catching up to that logic.

u/Brighter_rocks — 17 hours ago

Why BI teams get treated as report-monkeys

BI people often complain that business teams see them as “dashboard monkeys”. But if we’re completely honest, BI teams sometimes create this perception themselves.

Example 1: Stakeholder: “Can you send me campaign performance data?” BI: sends CSV export.

WOW!

Now the stakeholder has: another spreadsheet, another version of the metric, another manually built report.

Two weeks later everybody asks why numbers don’t match across dashboards. Well, because nobody stopped to ask: “What are you actually trying to decide?”

Example 2: Business: “We need to understand why retention dropped.” BI: starts explaining joins, dbt models, refresh logic, attribution definitions, filter behavior.

But nobody answers the real business question.

A lot of BI communication is technically accurate - but that accuracy hides the business problem instead of answering it.

Example 3: Stakeholder: “Can we visualize how revenue changed from last quarter?”

BI: “Technically Power BI/Tableau doesn’t support this natively…”

What could actually work:

Option 1: stacked bar with a running total column - builds in 20 minutes, works for most stakeholders, no custom visuals needed

Option 2: custom visual from AppSource - looks exactly like a waterfall, takes a couple of hours, harder to maintain when the data model changes

That's the answer. Two options, tradeoffs stated, stakeholder picks. The "not supported natively" part is irrelevant to them.
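If the stakeholder picks Option 1, the running total is a stock DAX pattern, assuming a [Revenue] base measure and a standard date dimension:

```
-- running total over the dates visible in the current selection
Revenue Running Total =
CALCULATE (
    [Revenue],
    FILTER (
        ALLSELECTED ( 'Date'[Date] ),
        'Date'[Date] <= MAX ( 'Date'[Date] )
    )
)
```

Twenty minutes of work, and the stakeholder never needs to hear the word "natively".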

u/Brighter_rocks — 8 hours ago

RTO mandates won't be killed by employees. Climate and geopolitics will do it first

New workforce research predicts that mandatory return-to-office policies will become "virtually unenforceable" by the early 2030s - because climate disruptions and geopolitical instability will make centralized offices logistically impossible. The distributed work era isn't ending. It's just on a very awkward pause.

Which means the people quietly building remote-ready skills and habits right now are positioning themselves better than they probably realize.

The logic in the report is straightforward: extreme heat events, flooding, and supply chain disruptions are already forcing ad hoc remote arrangements across industries. Geopolitical instability is making certain office locations genuinely risky. Companies that invested heavily in "return to office" infrastructure are going to find themselves holding a very expensive assumption. And knowledge workers - analysts, developers, strategists - are the most natural candidates for distributed work when that shift accelerates again.

A few things worth thinking about if you're in this space:

Your async communication skills matter more than your in-person presence. The people who thrive in distributed teams aren't just comfortable working alone - they write clearly, document decisions, and don't create bottlenecks that require a meeting to resolve.

Building a strong external reputation now is cheap insurance. When the office isn't the default, visibility inside your company becomes harder to maintain. People who have a presence outside it - a portfolio, a community, a track record that lives somewhere other than internal Slack - have more options.

Remote infrastructure is a skill. Knowing how to run a distributed project, across time zones, with clear ownership and minimal coordination overhead, is genuinely hard. The people who've figured it out will be in demand when the next wave of distributed work hits.

The companies currently fighting hardest for RTO aren't winning a culture war. They're burning goodwill on a policy that the next decade will quietly retire anyway.

u/Brighter_rocks — 4 days ago

Companies that replace humans with AI entirely are going to crash. A major report basically confirms it

The 2026 Human Edge report is pretty direct about it: blind automation is the wrong path and will lead to business failures, and quickly.

Most AI failures aren't dramatic. The model doesn't go rogue. It just produces something slightly wrong, nobody catches it, and that error compounds through a pipeline until it's a real problem - a bad forecast, a flawed report, a decision built on garbage data. The failure is the missing human who would have noticed.

Which is why the role actually gaining value right now isn't "prompt engineer." It's the person with enough domain expertise to sense when an output doesn't smell right, even before they can explain why.

A few things that genuinely help here:

Knowing which errors to expect from which systems. LLMs hallucinate. Recommendation models amplify existing bias. Forecasting models quietly drift when the underlying data changes. These aren't random failures - they're predictable ones.

Domain depth over tool fluency. The people who catch AI mistakes aren't always the best at using AI. They're the ones who know the subject matter well enough to notice when something is off.

The companies that will struggle aren't the slow adopters. They're the ones moving fast while hollowing out the human expertise that made their outputs trustworthy. By the time they realize it, that institutional knowledge is already gone.

u/Brighter_rocks — 4 days ago

A new global workforce report found that 39% of core job skills will change by 2030 - and the fastest-growing ones aren't technical. They're complex problem-solving, intuition, cognitive flexibility, and creativity. The things we used to dismiss as "soft skills" or "pre-industrial" are becoming the actual competitive edge in an AI-saturated market.

Think about what that means. The last two years were dominated by anxiety about Python, SQL, and whether your job title would survive the next model release. Meanwhile, the skills that are quietly becoming irreplaceable are the ones algorithms still can't fake: genuine curiosity, the ability to reframe a problem, knowing what question to ask before you run the analysis.

The report is pretty direct about why: true AI literacy won't mean less thinking. It will require more. Someone still has to decide what the output means, whether to trust it, and what to do when it's confidently wrong. That someone needs judgment - not just prompts.

The people who spent 2023 and 2024 optimizing for tool fluency may have been solving the wrong problem entirely.

u/Brighter_rocks — 7 days ago

I’ve just read the Gartner report on AI & Data predictions for 2026 (maybe I shouldn’t have) and honestly… it feels a bit schizophrenic.

Like everyone is out there colonizing the future - AI agents, autonomous decisioning, fully data-driven orgs - while in my company we still can’t fix basic master data.

We’re talking about self-healing systems, but our product hierarchies are still broken and no one trusts the numbers.

Feels like we’re skipping a few steps.

u/Brighter_rocks — 22 days ago