from Microsoft Excel | Help & Support with your Formula, Macro, and VBA problems | A Reddit Community


Need advice on whether my Excel cleanup approach was the best solution

I was asked at work to modify an Excel table with 10 columns. Half of the columns contained company-related data, while the other half contained agent-related data.

The requirement was a bit specific:

Company rows could still repeat and needed to stay in the dataset.

But the agent-side data should not be counted more than once when duplicated, because the repeats were inflating totals and making the agent calculations inaccurate.

What I ended up doing was:

Using the agent-related text columns to identify duplicate rows.

If a row was considered a duplicate from the agent side, I set the quantity/numeric values for the duplicated agent data to 0.

After that, I made those duplicate cells white in Excel so they wouldn’t stand out visually.

It works for the totals/calculations now, but I’m wondering if this was actually a good approach or if there’s a cleaner/more professional way to handle this in Excel or Power Query?
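Outside of Excel itself, the same "zero out repeated agent values, keep the rows" step can be scripted. Below is a minimal sketch in pandas, using hypothetical column names (`agent`, `agent_sales`, etc. are placeholders, not from the original table), assuming the first occurrence of each agent key keeps its values and later repeats are zeroed:

```python
import pandas as pd

# Hypothetical stand-in for the 10-column table: some company columns,
# some agent columns. Column names here are illustrative only.
df = pd.DataFrame({
    "company":     ["Acme", "Acme", "Beta"],
    "company_rev": [100, 100, 50],
    "agent":       ["Smith", "Smith", "Jones"],
    "agent_sales": [10, 10, 7],
})

agent_key_cols = ["agent"]        # text columns that identify an agent record
agent_num_cols = ["agent_sales"]  # numeric agent columns whose repeats distort totals

# Flag every repeat of an agent key after its first occurrence,
# then zero only the agent-side numeric values on those rows.
# Company rows are untouched, so repeats stay in the dataset.
dupes = df.duplicated(subset=agent_key_cols, keep="first")
df.loc[dupes, agent_num_cols] = 0
```

In Excel terms this is the same logic as a COUNTIFS-based "is this the first occurrence?" flag driving the zeroing, but keeping it as an explicit step (rather than white-font formatting) makes the adjustment visible and auditable.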

submitted by /u/Resident_Quantity827


Tagged with

#data cleaning solutions
#Excel
#cleanup