Great examples of the fallacy Don Norman points out in "The design of everyday things":
Assigning "human error" and declaring the analysis done, not considering design may be at fault.
Just Ctrl+F the keywords; the exact phrase is used three times and paraphrased a dozen times more. My favorite is “It was basically human error… there’s nothing wrong with our accounting systems”. If your spreadsheet has billion-dollar impact, why is there no four-eyes principle? Why do humans even have a hand in data transfer? No sanity checks? No automation? Stop blaming the user!
This. I read maybe half a dozen of the stories and just kept thinking “they don’t have great quality control measures in place”. Like you said, the four-eyes principle.
I've thought a lot about this. It's certainly easy to dunk on spreadsheets being error-prone (and even easier to dunk on VBA!). It's a decades-long running joke: https://dilbert.com/strip/2007-08-08
But at a higher level, what strikes me is that spreadsheets don't get checked if they tell you what you want to hear. These stories are generally about afflicting the afflicted (Reinhart and Rogoff) and banks and traders taking on too much global risk in pursuit of local reward (the London Whale).
This is a problem with the intersection of software and society in general, and I don't have an answer.
> But at a higher level, what strikes me is that spreadsheets don't get checked if they tell you what you want to hear.
https://dilbert.com/strip/2016-01-07
Further: https://dilbert.com/search_results?terms=spreadsheet
"There are only two kinds of languages: the ones people complain about and the ones nobody uses."[1] That applies to spreadsheets as well as to C++.
[1] http://www.stroustrup.com/bs_faq.html#really-say-that
Ray Panko of the University of Hawaii has studied the problem (and prevalence) of spreadsheet errors for decades:
https://www.researchgate.net/profile/Ray_Panko
https://web.archive.org/web/20070617035246/http://panko.shid...
Back in 1987 I worked at a company that used Microsoft Multiplan, which was an early spreadsheet program.
It did not have a "sure you want to exit?" nor any form of intelligent saving.
One of the guys worked on a spreadsheet all night without saving it along the way and hit exit, and it did.
https://en.wikipedia.org/wiki/Multiplan
One somewhat spreadsheet-related error I experienced firsthand was when most people in the class I was in got the wrong grade for the final. The professor had ordered the students alphabetically in a spreadsheet, then copied their grades from there to the website. But one student had a name with a non-ASCII character, and the spreadsheet and the website disagreed on the alphabetical ordering in that case, so most students got the grade of the person alphabetically next to them.
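For the curious, the mismatch is easy to reproduce: plain code-point sorting and locale-style collation disagree as soon as a non-ASCII letter appears. A minimal Python sketch (the names are made up, and the accent-stripping key only approximates what a real collator does):

```python
import unicodedata

students = ["Baker", "Ågren", "Adams", "Clark"]

# Code-point sort: "Å" (U+00C5) compares greater than every ASCII
# letter, so "Ågren" lands last.
codepoint_order = sorted(students)

# Accent-insensitive sort, roughly what a locale-aware collator does:
# decompose, drop the combining marks, then compare case-insensitively.
def collation_key(name):
    decomposed = unicodedata.normalize("NFD", name)
    return "".join(c for c in decomposed if not unicodedata.combining(c)).lower()

collated_order = sorted(students, key=collation_key)

print(codepoint_order)  # ['Adams', 'Baker', 'Clark', 'Ågren']
print(collated_order)   # ['Adams', 'Ågren', 'Baker', 'Clark']
```

Every name below the disputed one shifts by one position between the two orderings, which is exactly how each student ended up with a neighbor's grade.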
It seems a lot of these come down to the fact that people don’t do a final sanity check. I have seen it quite a few times where people threw out numbers in meetings that simply didn’t make sense but weren’t challenged.
We've seen this in customer deployments as well [1], which is why we always recommend UIs linked to a spreadsheet that help verify the correctness of the underlying spreadsheet logic.
[1] Implementations of MintData spreadsheets for internal tooling & line of business applications.
https://mintdata.com
Say what you will about errors in spreadsheets, but an underappreciated benefit is that they enable this kind of post-hoc analysis/error checking.
The transparency of 'it's all in this workbook and you can trace it yourself' means finding mistakes (and there are always going to be mistakes) and finding the logic behind the conclusions (because there is always going to be debate about methodology and cleaning practices) is a million times easier than if the analysis was done in code.
It may be the case that you get more errors per hour in a spreadsheet than in code, but I'd bet 5:1 that errors in code-based systems persist much longer, due to the lack of transparency in what the machine is thinking.
How is a spreadsheet easier to audit than a codebase?
All data states and all logic are in one file
But spreadsheet logic is buried unseen in cells and spreadsheets don't have good version control for reviewing new changes and auditing change history.
Yes, but to do good version control on code based systems you need to version both the code, and the data being run through the pipes.
While there are lots of good tools to version code (git etc), and lots of tools to version control data (bitemporality being the usual requirement), does anyone know a tool which allows you to do both, at the same time?
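I don't know of one standard tool, but the pattern that newer tools (DVC, for example) automate is to pin the exact data by content hash in a small manifest committed alongside the code, so a single git tag identifies both the logic and the bytes it ran on. A minimal sketch, with made-up file names:

```python
import hashlib
import json

def data_fingerprint(payload: bytes) -> str:
    """Content hash that uniquely identifies this exact dataset."""
    return hashlib.sha256(payload).hexdigest()

# Stand-in for the data flowing through the pipes.
data = b"id,amount\n1,100\n2,250\n"

# This manifest would live in git next to the pipeline code; re-running
# the pipeline can verify the data on disk still matches the hash.
manifest = {
    "dataset": "payments.csv",
    "sha256": data_fingerprint(data),
}
print(json.dumps(manifest, indent=2))
```

It doesn't give you bitemporality on its own, but it does make "which code ran on which data" answerable from version control alone.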
Git LFS?
Grist spreadsheets have a page where you can review all formulas as code https://support.getgrist.com/formulas/#code-viewer - I find this handy when figuring out someone else's spreadsheet, or my own once I've forgotten how they work, and would love to see this become a standard spreadsheet feature (disclosure: I work on Grist)
But I can review my code as code?
As someone who has spent many countable (billable) hours doing Excel audits: yes, there is.
You build them yourself.
The fallacy that you're bringing to Excel is pretending that it's a "be-all and end-all" in a single package.
It's a tool with an embedded programming environment. Much like you'd take a basic Python install, and add on additional tools, the same applies to Excel.
The embedded programming language (VBA) is a fantastic language--it has absolutely no libraries out of the box, which makes it a poor development /environment/.
To directly address your question:
- include a "version" cell on a "Config" worksheet (optionally include values for "audited by", "audited date", "audit version", "audit type"; note this could be multi-row, so you could actually specify "full" audits and mini audits)
- build a function that maps all formulas to an "Audit" worksheet
- build a function to export to CSV with the date and version tagged
- build a function to export all code modules to text files
- create whatever release-versioning you want around this: zip of exports + "deployment" XLS, GIT repo with tagged version and exports and a binary blob of the XLS, etc, etc.
(Edit: workbook vs worksheet; each tab in a workbook is a worksheet.)
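The "map all formulas to an Audit worksheet" step is also easy to sketch from outside Excel. Here is a minimal, library-free Python version over a dict that stands in for a workbook (in practice you would read the real .xlsx, e.g. with openpyxl, or do it in VBA as described above; all names here are invented):

```python
def extract_formulas(workbook):
    """Return (sheet, cell, formula) rows for every formula cell."""
    audit_rows = []
    for sheet, cells in workbook.items():
        for coord, value in sorted(cells.items()):
            # Formula cells store their text, e.g. "=SUM(A1:A10)".
            if isinstance(value, str) and value.startswith("="):
                audit_rows.append((sheet, coord, value))
    return audit_rows

# Fake workbook: {worksheet name: {cell coordinate: value}}.
book = {
    "Config": {"A1": "version", "B1": "1.4.2"},
    "Model":  {"A1": 100, "A2": 250, "A3": "=SUM(A1:A2)", "B1": "=A3*0.2"},
}

for row in extract_formulas(book):
    print(row)  # ('Model', 'A3', '=SUM(A1:A2)') then ('Model', 'B1', '=A3*0.2')
```

Dumping this table to CSV per release is what makes formula changes diffable in ordinary version control.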
Like a lot of things, I think you have two parts here. A small spreadsheet can be more understandable than a small program because all the logic is apparently visible. A large spreadsheet can become harder to understand and verify than the medium-sized program that could duplicate its logic.
Yes. I wonder whether it is feasible to run unit tests in spreadsheets: they are essentially long chains of functions.
Debugging a larger sheet seems really tough.
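The "long chain of functions" framing suggests what such tests could look like: treat each cell as a function of other cells and pin known inputs to expected outputs. A hypothetical sketch (the cell names and formulas are invented):

```python
# Each "cell" is a function that may look up other cells, mimicking
# spreadsheet formulas like profit = revenue - costs.
cells = {
    "revenue": lambda get: 1200.0,
    "costs":   lambda get: 800.0,
    "profit":  lambda get: get("revenue") - get("costs"),
    "margin":  lambda get: get("profit") / get("revenue"),
}

def get(name):
    """Recursively evaluate a cell, like a spreadsheet recalculation."""
    return cells[name](get)

# The "unit tests": known inputs, expected outputs.
assert get("profit") == 400.0
assert abs(get("margin") - 1 / 3) < 1e-9
print("all spreadsheet tests passed")
```

A real spreadsheet version of this would be a dedicated "Tests" worksheet of known inputs and expected outputs, with a cell that flags any mismatch.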
While I enjoy snarking at amateurs' mistakes as much as anybody, I believe it's an overlooked fact that nothing has been as successful in getting non-programmers to program, and in getting the benefits of programming out there, as spreadsheets have.
Excel gets almost as much grief as PowerPoint does. But I've met 80-year-old booksellers running their own demand-forecasting spreadsheets, which were more sophisticated and accurate than quite a few "business intelligence" portals I've seen over the years. And isn't that awesome?
It's incredible what you can achieve with spreadsheets. Done well these spreadsheets are also a perfect basis for further digitalisation, simply because the underlying process is running smoothly.
Not that the authors would claim any differently, but I am going to go out on a limb and state that RDBMS Horror Stories are more frequent, and with higher overall impact.
Purely from a SQLi point of view we have: https://codecurmudgeon.com/wp/sql-injection-hall-of-shame/
Or programming in general horror stories: http://thedailywtf.com/
> In a paper, 'Does High Public Debt Consistently Stifle Economic Growth? A Critique of Reinhart and Rogoff,' Thomas Herndon, Michael Ash, and Robert Pollin of the University of Massachusetts, Amherst criticise a 2010 paper by Harvard economists Carmen Reinhart and Kenneth Rogoff, 'Growth in a Time of Debt.' They find three main issues: First, Reinhart and Rogoff selectively exclude years of high debt and average growth. Second, they use a debatable method to weight the countries. Third, there also appears to be a coding error that excludes high-debt and average-growth countries. All three bias in favor of their result, and without them you don't get their controversial result."
This one is kinda buried in the list. This is what motivated the harsh austerity measures in Greece. So an entire country was punished because of a bug in a spreadsheet.
http://theconversation.com/the-reinhart-rogoff-error-or-how-...
https://prospect.org/culture/books/the-crash-of-austerity-ec...
https://www.theguardian.com/business/ng-interactive/2015/apr...
https://www.thestranger.com/slog/archives/2013/04/17/how-mic...
http://www.cc.com/video-clips/dcyvro/the-colbert-report-aust...
interview with herndon http://www.cc.com/video-clips/kbgnf0/the-colbert-report-aust...
> So an entire country was punished because of a bug in a spreadsheet.
Not really: the authors of the spreadsheet made one or more errors, AND they did not double-check the results, NOR did any of the top-level economists criticize the results.
In other words an entire country was (severely) punished because of a wrong theory (based on erroneous data) that all the establishment accepted because it came from Harvard.
Sounds a lot like the theory was accepted without being verified because it stated what they (the ones pushing for austerity) wanted it to say. Not disagreeing with your list but maybe it's worth adding the politicians/decision makers involved to it.
Sure, by "establishment" I intended the politicians too.
Still, the main responsibility sits with the economists and their peers.
I mean, not that I want in any way to absolve the politicians, mind you, but what can they do, not being economists or more generally technical experts in this or that field, if not choose among the theories offered by people credentialed by renowned universities?
In my experience what matters here is reputation (and network), which in itself is not "bad" in an absolute sense, but lately I see a lot of sloppy or downright "wrong" work (even if rarely with such severe consequences) coming from these people.
It seems to me that there are no (or not enough) critical reviews, checks, etc. in some (scientific or pseudo-scientific) circles.
The broad theory is that austerity is necessary for growth - it's a serious theory even if I think it turned out to be wrong with terrible human consequences.
The Reinhart and Rogoff paper was more like a factoid of a theory - no complicated model, just a dubious claim that trips off the tongue - "Studies have shown that when debt reaches X level, growth tanks"
The thing about this is that when the shoddy quality of the study became obvious, people could act as if it was what drove all the horrible things done under austerity, as if those policies weren't the consensus of a large group, study or no study.
Proper response to this:
https://www.ft.com/content/01fc06b8-fb6e-3e36-acb0-a1f8b47a7...
It remains true that high debt ratios hamper growth.
No. Your link is behind a paywall, but I followed the debate over the years, and their results are quite thoroughly debunked.
The article addresses the issues raised by OP. You don't need to read the paywalled article since it just summarizes the statement by Reinhart & Rogoff which is available here [1]. Here is an archive of the actual article if you would prefer to read that [2].
[1] https://scholar.harvard.edu/files/rogoff/files/response_to_h...
[2] https://archive.md/2daXH
So before the corrections the growth was -0.1 percent for debt levels over 90 percent. After their corrections the growth was 2.2 percent. And then they try to pretend that that doesn’t really change the main conclusion of their paper, which was that high debt levels destroy growth.
It doesn't look good for Reinhart and Rogoff, but if it hadn't been their paper it would have been something else. The political will existed in Europe to subject Greece to extreme measures; it was going to happen. If Germany hadn't found this scapegoat they would have found some other one.
And people claim to wonder why Greece is friendly with China...
>This one is kinda buried in the list. This is what motivated the harsh austerity measures in Greece. So an entire country was punished because of a bug in a spreadsheet.
That isn't true. This paper indicated that moderately high levels of public debt constrained economic growth. As you have pointed out, the paper was riddled with errors and the evidence that moderately high levels of public debt slow economic growth isn't really there.
What happened in Greece is quite different. Lenders lost confidence in the Greek government's ability to repay their debt, partly because a restatement of public debt increased the debt level by 11% overnight.
What the Greek government did after that to get out from under this was to cut spending (austerity). Economists have always been sceptical that this would work, since a recession is the very worst time to cut spending: it reduces aggregate demand at a time when it is already down.
Indeed it did not work, as we all know, but Eurozone lenders were not willing to allow a partial default, which is what should have happened, and being in the Euro means that there is no possibility of a devaluation.
Well, basically the EU forced harsh measures on Greece so Greece could get money from the EU to repay debts to banks and other institutional investors. Investors didn't lose anything (oversimplifying here), the EU got its money back, and Greece, along with the Greeks, paid the bill. It would have been easier to just wire the money over to the investors directly and leave the Greeks alone.
Investors understood the rescue of Greece exactly like that anyway if you ask me. But maybe the EU had to maintain the public impression of austerity. Or rather the Germans.
Greece consistently ran huge deficits during the crisis. They had a huge debt write-off in 2012 and still ended up with 180% debt/GDP; that doesn't happen if your spending policy is austere.
https://3gp11q1ujq964apmpt3s9cda-wpengine.netdna-ssl.com/wp-...
Austerity is not measured in Euros but in the human casualties of the policy. https://en.wikipedia.org/wiki/Greek_government-debt_crisis#S...
That's something curious about the UK at the moment: borrowing is huge. The Tory government has borrowed more since they took over than the country had borrowed previously in its entire history (according to a stat I heard, but the actual figures are huge according to the Office for National Statistics too; this isn't just hearsay), and yet we've been going through massive austerity.
Spending on services has been drastically cut, but the country has been borrowing vastly more money. The only way I see that working is if taxes have plummeted (but taxes for ordinary folk have risen slightly if anything) or if the money is being syphoned off somewhere ... but we're talking GDP levels of cash here.
Anyone explain it to me?
Well, realistically many people are dying from spreadsheet incompetence (also named "clinical research") every day, so...
Austerity is effective when it is achieved by cutting spending. The cases where it hasn't worked were because the spending cuts were accompanied with tax increases [1]. This shouldn't be a surprise because tax increases tend to decrease GDP [2].
[1] A. Alesina, C. A. Favero, and F. Giavazzi, Austerity: When It Works and When It Doesn't. Princeton, NJ: Princeton University Press, 2019.
[2] C. D. Romer and D. H. Romer, “The Macroeconomic Effects of Tax Changes: Estimates Based on a New Measure of Fiscal Shocks,” American Economic Review, vol. 100, no. 3, pp. 763–801, 2010.
Worth mentioning Exce-Lint, an Excel add-in that automatically finds formula errors in spreadsheets.
https://plasma-umass.org/ExceLint-addin/
I wonder if "spreadsheet unit tests" are/should be a thing.
SQL based unit testing exists. So why not i guess?
http://dbunit.sourceforge.net/intro.html
They are a thing [1]. We use MintData spreadsheets for a large part of our regression test suite that tests MintData itself.
A bit meta, but yes, spreadsheets can definitely be used for the equivalent of what "unit tests" are in code.
Interestingly, we also have "integration tests" in our spreadsheets, but this has more to do with the fact that MintData spreadsheets have native API calling ability, so we can test with external services end-to-end.
[1] MintData, https://mintdata.com
Is there an intro plan that isn't $95 / month? I want to pay money for this, but there are alternatives at that pricing level.
Actually even paying $1140 ($95 * 12 months) for the on-premise version is better.
I got a lot of insights out of this open-source program.
https://medium.com/guesstimate-blog/introducing-guesstimate-...
https://github.com/getguesstimate/guesstimate-app
I wonder how those mistakes were caught - I guess it is hard for them to track since they seem to rely on news reports.
If it is usually some audit firm, it seems like things are working somewhat as intended, at least in the business-related cases. In other cases, like research, it might be important enough to hire an audit firm to verify your spreadsheet model and data provenance.
Where's the spreadsheet error that underlies the greatest scam of our time, supply side economics? I've heard of it, and would love to see it analyzed.
Released in 2018, so needs the relevant tag (I think).
They've been doing this for many years, and I think it's an ongoing effort.