
Comment by scrumper

11 years ago

Excel is a perfect example. The core Excel experience is basically functional programming on a virtual machine with a matrix address space laid out visually right in front of you. It even looks like traditional programming if you dive into the VBA stuff, which plenty of non-technical specialists, including MBAs and managers, do regularly in pursuit of solving their problems.
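
To make the analogy concrete, here's a minimal Python sketch of the idea (purely an illustration, not a model of Excel's actual recalculation engine): each cell is a pure function of other cells, and "recalculation" is just evaluating that dependency graph.

```python
# Toy spreadsheet: each cell is a pure function of the grid, like an Excel formula.
# Illustration of "functional programming over a matrix of cells" only.

cells = {
    "A1": lambda get: 10,                     # =10
    "A2": lambda get: 32,                     # =32
    "B1": lambda get: get("A1") + get("A2"),  # =A1+A2
    "B2": lambda get: get("B1") * 2,          # =B1*2
}

def get(ref):
    """Evaluate a cell by reference, recursively evaluating its inputs."""
    return cells[ref](get)

print(get("B2"))  # 84 -- change A1 or A2 and re-evaluate: that's "recalculation"
```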

Any specialist user willing to invest some time in learning their tools can do this. A culture develops around it.

And replying to parent: those efforts around teaching 'civilians' to code are probably misguided. The investment needs to go into adding scripting and programmability to existing line-of-business tools, not into encouraging people to sit in front of an isolated REPL disconnected from any business value or context.

+1; many companies have tons of internal processes which rely on Excel sheets. When these become painful enough, another team (internal applications) can come in, evaluate the situation, and build out a custom solution backed by an actual database. But Excel still provides a ton of value, since at the most basic level it's a database table with no validation and a free-form schema.
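
As a rough sketch of that migration path (assuming pandas with an Excel engine like openpyxl is available; the file, sheet, and table names are made up for illustration), lifting a sheet into a real database can start as small as this:

```python
# Hypothetical example: moving an ad-hoc Excel "table" into a real database.
import sqlite3
import pandas as pd

# Free-form, unvalidated rows straight out of the business's spreadsheet.
df = pd.read_excel("orders.xlsx", sheet_name="Sheet1")

with sqlite3.connect("orders.db") as conn:
    # Validation and typing would be layered on from here; to_sql just
    # gives the data a real home with real tooling around it.
    df.to_sql("orders", conn, if_exists="replace", index=False)
```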

The downside is that when all you have are Excel sheets, everything looks like rows and columns (and not e.g. objects with behaviors). If Excel had more robust import/export mechanisms that normal users could understand (e.g. a built-in REST client with JSON + XML serializers plus many database adapters, with lots of helpful wizards or tools to guide you), it'd be way more powerful. Then again, if someone is at the point where they can look at some JSON, compare it with their spreadsheet, and describe the mappings, they're possibly better off going to some training sessions on ${your favorite programming language} to learn how to do this the easy way.
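
For a sense of what that JSON-to-spreadsheet mapping looks like in code, here's a hedged Python sketch (the payload shape and field names are invented, and writing xlsx assumes openpyxl is installed) that flattens nested records into the rows and columns a spreadsheet user would recognize:

```python
# Hypothetical JSON payload (e.g. from an internal REST API) flattened into
# spreadsheet-style rows and columns. Field names are made up for illustration.
import pandas as pd

payload = [
    {"id": 1, "customer": {"name": "Acme", "region": "EMEA"}, "total": 120.0},
    {"id": 2, "customer": {"name": "Globex", "region": "APAC"}, "total": 80.5},
]

# json_normalize is the "describe the mappings" step: nested fields become columns.
df = pd.json_normalize(payload)
print(df.columns.tolist())  # e.g. ['id', 'total', 'customer.name', 'customer.region']

df.to_excel("orders_flat.xlsx", index=False)  # back into Excel for the business user
```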

  • Nicely put. It's actually rather agile: Excel becomes a prototyping tool that allows the business users to describe what a solution looks like, helping to guide development of that custom solution. Sadly, few IT organizations are confident enough to trust their users and work like this, instead starting from zero with a pedantic requirements-gathering process before building something less flexible and less useful. The problem is partly a lack of domain knowledge in the internal apps team (which is understandable), and partly a kind of technology-first arrogance which prevents that team from making use of the intellectual capital originated by the business in their spreadsheets and processes (which is inexcusable, really).

    Ideally, an organization comes to understand that Excel is a fantastic tool at the frontier where the business needs to adapt rapidly, but that once a process has stabilized, replacing it with a more rigid system is worth trading some flexibility for reduced operational risk.

    To your second paragraph: much of that falls to internal apps, who need to provide decent RESTful APIs across their systems. Some companies are doing this, and in the process getting to a point where the Excel frontier is just analyzing and reporting on data, not acting as a source in its own right. Then you have traceability for every data point in the organization, and you're in a pretty sweet spot operationally.

Thanks to John Foreman's book "Data Smart", I now regularly test small-scale machine learning problems in Excel. Sure, a developer will translate the end result to scikit-learn or AzureML, but this is stuff that's easily accessible to even mildly technical enthusiasts.
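
For the "translate to scikit-learn" step, a sketch of what that hand-off might look like (the numbers below are invented; imagine they were pasted out of the spreadsheet where the model was prototyped):

```python
# Hypothetical hand-off: a regression prototyped in a spreadsheet, redone in scikit-learn.
from sklearn.linear_model import LinearRegression

X = [[1.0], [2.0], [3.0], [4.0], [5.0]]  # a predictor column from the sheet
y = [2.1, 4.2, 5.9, 8.1, 10.0]           # the observed outcome column

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)      # the same slope/intercept LINEST would give
print(model.predict([[6.0]]))             # score a new row
```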