News, Views & Insights: Wolfram Language
Thrust Supersonic Car Engineering Insights: Applying Multiparadigm Data Science
Having a broad toolset and an open mind about how to approach data can lead to interesting insights that are missed when data is viewed only through the lens of statistics or machine learning. It’s something we at Wolfram Research call multiparadigm data science, and here I use it for a small excursion through calculus, graph theory, signal processing, optimization and statistics to gain some interesting insights into the engineering of supersonic cars.
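As a purely illustrative sketch (the signal below is synthetic, not the post's actual Thrust SSC telemetry), the Wolfram Language makes it easy to pass the same data through several of these paradigms in a few lines:

    (* synthetic noisy signal standing in for run telemetry *)
    signal = Table[Sin[2 Pi 5 t] + 0.3 RandomReal[{-1, 1}], {t, 0., 1., 0.001}];

    LowpassFilter[signal, 0.2]   (* signal processing: suppress high-frequency noise *)
    FindPeaks[signal, 10]        (* peaks that survive Gaussian smoothing at scale 10 *)
    Mean[signal]                 (* elementary statistics on the same data *)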
Cleaning and Structuring Large Datasets: Web Scraping with the Wolfram Language, Part 2
In my previous post, I demonstrated the first step of a multiparadigm data science workflow: extracting data. Now it's time to take a closer look at how the Wolfram Language can help make sense of that data by cleaning it, sorting it and structuring it for your workflow. I'll discuss key Wolfram Language functions for making imported data easier to browse, query and compute with, as well as share some strategies for automating the process of importing and structuring data. Throughout this post, I'll refer to the US Election Atlas website, which contains tables of US presidential election results for given years.
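As a small sketch of that cleaning step (the headers and vote counts below are invented placeholders, not the Atlas's actual figures), here is one way to turn raw scraped rows into a queryable Dataset:

    (* raw stands in for the nested lists returned by Import[url, "Data"] *)
    raw = {{"Year", "Candidate", "Votes"},
           {"2016", "Candidate A", "1,234,567"},
           {"2016", "Candidate B", "987,654"}};

    (* strip thousands separators and convert vote strings to integers *)
    clean = {#[[1]], #[[2]], ToExpression[StringReplace[#[[3]], "," -> ""]]} & /@ Rest[raw];

    (* thread the header row over each record to get a browsable Dataset *)
    ds = Dataset[AssociationThread[First[raw], #] & /@ clean]

From there, queries such as ds[Select[#Votes > 10^6 &]] operate directly on the structured data.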
The 2018 Wolfram Summer School: A Recap
Former Astronaut Creates Virtual Copilot with Wolfram Neural Nets and a Raspberry Pi
Big O and Friends: Tales of the Big, the Small and Every Scale in Between
One of the many beautiful aspects of mathematics is that often, things that look radically different are in fact the same, or at least share a common core. On the surface, algorithm analysis, function approximation and number theory seem entirely unrelated. After all, the first is about computer programs, the second is about smooth functions and the third is about whole numbers. However, they share a common toolset: asymptotic relations and the important concept of asymptotic scale.
By comparing the “important parts” of two functions (a common trick in mathematics), asymptotic analysis classifies functions based on the relative size of their absolute values near a particular point. Depending on the application, this comparison provides quantitative answers to questions such as “Which of these algorithms is fastest?” or “Is function f a good approximation to function g?”. Version 11.3 of the Wolfram Language introduces six of these relations, summarized in the following table.
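The table itself is not reproduced in this excerpt, but as a brief sketch, the six relations are AsymptoticLess, AsymptoticLessEqual, AsymptoticGreater, AsymptoticGreaterEqual, AsymptoticEqual and AsymptoticEquivalent, corresponding roughly to little-o, big-O, ω, Ω, Θ and asymptotic equivalence (~). For example:

    AsymptoticLess[Log[x], x, x -> Infinity]           (* Log[x] = o(x): True *)
    AsymptoticLessEqual[x^2, x^3 + x, x -> Infinity]   (* x^2 = O(x^3 + x): True *)
    AsymptoticEquivalent[x^2 + x, x^2, x -> Infinity]  (* ratio tends to 1: True *)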
Getting to the Point: Asymptotic Expansions in the Wolfram Language
We’ve Come a Long Way in 30 Years (But You Haven’t Seen Anything Yet!)
Technology for the Long Term

On June 23 we celebrate the 30th anniversary of the launch of Mathematica. Most software from 30 years ago is now long gone. But not Mathematica. In fact, it feels in many ways like even after 30 years, we’re really just getting started. Our mission has always been a big […]
How Optimistic Do You Want to Be? Bayesian Neural Network Regression with Prediction Errors
Neural networks are well known for their uses in machine learning, but they can also be applied to other, more specialized tasks, such as regression. Many people would probably first associate regression with statistics, but let me show you the ways in which neural networks can be helpful in this field. They are especially useful when the data you're interested in doesn't follow an obvious underlying trend you can exploit, as it would in polynomial regression.
In a sense, you can view neural network regression as an intermediary between true regression (where you have a fixed probabilistic model with some underlying parameters you need to find) and interpolation (where your goal is mostly to draw an eye-pleasing line between your data points). Neural networks give you something from both worlds: the flexibility of interpolation and the ability to produce predictions with error bars, as you get with true regression.
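As a minimal sketch of that middle ground (using Predict's built-in neural network method and its per-point uncertainty property, rather than the post's full Bayesian construction), assuming synthetic noisy data:

    (* noisy samples from a known curve *)
    data = Table[x -> Sin[x] + RandomVariate[NormalDistribution[0, 0.1]], {x, 0., 6., 0.1}];

    p = Predict[data, Method -> "NeuralNetwork"];

    p[2.5]                        (* point prediction, close to Sin[2.5] *)
    p[2.5, "StandardDeviation"]   (* per-point uncertainty: the basis for error bars *)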