Stephen Wolfram

We’ve Come a Long Way in 30 Years (But You Haven’t Seen Anything Yet!)

June 21, 2018 — Stephen Wolfram

30 years of Mathematica

Technology for the Long Term

On June 23 we celebrate the 30th anniversary of the launch of Mathematica. Most software from 30 years ago is now long gone. But not Mathematica. In fact, it feels in many ways like even after 30 years, we’re really just getting started. Our mission has always been a big one: to make the world as computable as possible, and to add a layer of computational intelligence to everything.

Our first big application area was math (hence the name “Mathematica”). And we’ve kept pushing the frontiers of what’s possible with math. But over the past 30 years, we’ve been able to build on the framework that we defined in Mathematica 1.0 to create the whole edifice of computational capabilities that we now call the Wolfram Language—and that corresponds to Mathematica as it is today.

From when I first began to design Mathematica, my goal was to create a system that would stand the test of time, and would provide the foundation to fill out my vision for the future of computation. It’s exciting to see how well it’s all worked out. My original core concepts of language design continue to infuse everything we do. And over the years we’ve been able to just keep building and building on what’s already there, to create a taller and taller tower of carefully integrated capabilities.

It’s fun today to launch Mathematica 1.0 on an old computer, and compare it with today:

Older Mac versus iPhone

CONTINUE READING »


Sebastian Bodenstein
Matteo Salvarezza
Meghan Rieu-Werden
Taliesin Beynon

Launching the Wolfram Neural Net Repository

June 14, 2018
Sebastian Bodenstein, Senior Developer, Advanced Research Group
Matteo Salvarezza, Developer, Advanced Research Group
Meghan Rieu-Werden, Data Manager, Advanced Research Group
Taliesin Beynon, Lead Developer, Advanced Research Group


Today, we are excited to announce the official launch of the Wolfram Neural Net Repository! A huge amount of work has gone into training or converting around 70 neural net models that now live in the repository, and can be accessed programmatically in the Wolfram Language via NetModel:


net = NetModel["ResNet-101 Trained on ImageNet Competition Data"]

Applying the net to an image, here a photo of a peacock, returns the predicted class (input and output images shown in the post):
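In code, that step might look like the following sketch, where the file name is only a stand-in for an actual test photo:

img = Import["peacock.jpg"];  (* placeholder file name; any test image works *)
net[img]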

Neural nets have generated a lot of interest recently, and rightly so: they form the basis for state-of-the-art solutions to a dizzying array of problems, from speech recognition to machine translation, from autonomous driving to playing Go. Fortunately, the Wolfram Language now has a state-of-the-art neural net framework (and a growing tutorial collection). This has made possible a whole new set of Wolfram Language functions, such as FindTextualAnswer, ImageIdentify, ImageRestyle and FacialFeatures. And deep learning will no doubt play an important role in our continuing mission to make human knowledge computable.
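For example, a single call to ImageIdentify is enough to classify an arbitrary image (again, the file name is only a placeholder):

ImageIdentify[Import["peacock.jpg"]]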

CONTINUE READING »


Sjoerd Smit

How Optimistic Do You Want to Be? Bayesian Neural Network Regression with Prediction Errors

May 31, 2018 — Sjoerd Smit, Technical Consultant

Neural networks are very well known for their uses in machine learning, but they can also be applied to more specialized tasks, such as regression. Many people would probably first associate regression with statistics, but let me show you the ways in which neural networks can be helpful in this field. They are especially useful when the data you’re interested in doesn’t follow an obvious underlying trend that you could exploit with, say, polynomial regression.

In a sense, you can view neural network regression as a kind of intermediary solution between true regression (where you have a fixed probabilistic model with some underlying parameters you need to find) and interpolation (where your goal is mostly to draw an eye-pleasing line between your data points). Neural networks can get you something from both worlds: the flexibility of interpolation and the ability to produce predictions with error bars like when you do regression.
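For orientation, here is a minimal sketch of plain (non-Bayesian) neural net regression on synthetic data; the post extends this basic setup to produce the error bars:

data = Table[x -> Sin[x] + RandomVariate[NormalDistribution[0, 0.1]], {x, 0., 6., 0.1}];  (* noisy samples of a sine curve *)
net = NetChain[{LinearLayer[50], Tanh, LinearLayer[1]}, "Input" -> "Scalar", "Output" -> "Scalar"];
trained = NetTrain[net, data];
Show[ListPlot[List @@@ data], Plot[trained[x], {x, 0, 6}]]  (* data points together with the fitted curve *)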

Bayesian Neural Nets

CONTINUE READING »


Carlo Giacometti

Learning to Listen: Neural Networks Application for Recognizing Speech

May 24, 2018 — Carlo Giacometti, Kernel Developer, Algorithms R&D

Introduction

Recognizing words is one of the simplest tasks a human can do, yet it has proven extremely difficult for machines to achieve similar levels of performance. Things have changed with the ubiquity of machine learning and neural networks, though: modern techniques achieve dramatically better performance than the results from just a few years ago. In this post, I’m excited to show a reduced but practical and educational version of the speech recognition problem—the assumption is that we’ll consider only a limited set of words. This has two main advantages: first of all, we have easy access to a dataset through the Wolfram Data Repository (the Spoken Digit Commands dataset), and, maybe most importantly, all of the classifiers/networks I’ll present can be trained in a reasonable time on a laptop.

It’s been about two years since the initial introduction of the Audio object into the Wolfram Language, and we are thrilled to see so many interesting applications of it. One of the main additions to Version 11.3 of the Wolfram Language was tight integration of Audio objects into our machine learning and neural net framework, and this will be a cornerstone in all of the examples I’ll be showing today.

Without further ado, let’s squeeze out as much information as possible from the Spoken Digit Commands dataset!

Spoken Digit Commands dataset
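As a rough preview of the simplest possible approach, one could hand the audio/label pairs straight to Classify; note that the column names "Audio" and "Digit" below are assumptions about the resource's layout, and the post itself works with the neural net framework instead:

data = ResourceData["Spoken Digit Commands"];       (* fetch the dataset from the Wolfram Data Repository *)
pairs = Normal[data[All, #Audio -> #Digit &]];      (* assumed column names: recording -> spoken word *)
cl = Classify[pairs];                               (* automatic audio feature extraction and classification *)
cl[pairs[[1, 1]]]                                   (* classify the first recording *)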

CONTINUE READING »


Michael Trott

Strange Circles in the Complex Plane—More Experimental Mathematics Results

May 10, 2018 — Michael Trott, Chief Scientist

The Shape of the Differences of the Complex Zeros of Three-Term Exponential Polynomials

In my last blog, I looked at the distribution of the distances of the real zeros of functions of the form cos(x) + cos(α x) + cos(β x) with incommensurate α, β. And after analyzing the real case, I now want to have a look at the differences of the zeros of three-term exponential polynomials, sums of three exponentials with real parameters μ1, μ2, μ3. (While we could rescale to set μ1 = 0 and μ2 = 1 for the zero set, keeping all of μ1, μ2 and μ3 will make the resulting formulas look more symmetric.) Looking at the zeros in the complex plane, one does not see any obvious pattern. But by forming differences of pairs of zeros, regularities and patterns emerge, which often give some deeper insight into a problem. We do not make any special assumptions about the incommensurability of μ1, μ2, μ3.

The differences of the zeros of this type of function are all located on oval-shaped curves. Using experimental mathematics techniques, we will find a closed form for these ovals: an implicit equation whose solutions describe them exactly.
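Although the closed form itself appears in the full post, the ovals are easy to glimpse numerically; a small illustrative experiment (the exponents below are arbitrary stand-ins, not necessarily the post's parametrization) might look like:

f[z_] := 1 + Exp[z] + Exp[Sqrt[2] z];                              (* an arbitrary three-term exponential sum *)
seeds = Flatten[Table[x + I y, {x, -4., 4., 0.5}, {y, 0., 60., 1.}]];
roots = Quiet[Table[z /. FindRoot[f[z], {z, s}], {s, seeds}]];      (* hunt for zeros from a grid of complex starting points *)
zeros = DeleteDuplicates[Select[roots, Abs[f[#]] < 10^-8 &], Abs[#1 - #2] < 10^-6 &];
diffs = Subtract @@@ Subsets[zeros, {2}];                           (* differences of pairs of zeros *)
ListPlot[ReIm /@ Join[diffs, -diffs], AspectRatio -> Automatic]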

CONTINUE READING »

Posted in: Mathematics

Melanie Moore

Experience Innovation and Insight at the 2018 Wolfram Technology Conference

May 3, 2018 — Melanie Moore, Communications Project Manager

Join us October 16–19, 2018, for four days of hands-on training, workshops, talks and networking with creators, experts and enthusiasts of Wolfram technology. We’ll kick off on Tuesday, October 16, with a keynote address by Wolfram founder and CEO Stephen Wolfram.

CONTINUE READING »

Posted in: Events

Michael Trott

A Tale of Three Cosines—An Experimental Mathematics Adventure

April 24, 2018 — Michael Trott, Chief Scientist

Identifying Peaks in Distributions of Zeros and Extrema of Almost-Periodic Functions: Inspired by Answering a MathOverflow Question

One of the Holy Grails of mathematics is the Riemann zeta function, especially its zeros. One representation of ζ(s) is the infinite sum ζ(s) = ∑ n^(−s), summed over n = 1, 2, 3, …. In the last few years, the interest in partial sums of such infinite sums and their zeros has grown. A single cosine or sine function is periodic, and the distribution of its zeros is straightforward to describe. A sum of two cosine functions can be written as a product of two cosines, cos(a) + cos(b) = 2 cos((a + b)/2) cos((a − b)/2). Similarly, a sum of two sine functions can be written as a product of a sine and a cosine, sin(a) + sin(b) = 2 sin((a + b)/2) cos((a − b)/2). This reduces the zero-finding of a sum of two cosines or sines to the case of a single one. A sum of three cosine or sine functions, such as cos(x) + cos(α x) + cos(β x), is already much more interesting.
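As a small taste of the computations involved, the following sketch (with arbitrarily chosen incommensurate frequencies) collects the real zeros of such a sum on an interval and histograms the distances between consecutive zeros:

f[x_] := Cos[x] + Cos[Sqrt[2] x] + Cos[Sqrt[3] x];                               (* arbitrary incommensurate frequencies *)
xs = Range[0., 500., 0.05];
brackets = Select[Partition[xs, 2, 1], Sign[f[#[[1]]]] != Sign[f[#[[2]]]] &];     (* sign changes bracket the zeros *)
zeros = Table[x /. FindRoot[f[x], {x, b[[1]], b[[2]]}], {b, brackets}];
Histogram[Differences[zeros]]                                                     (* distribution of consecutive zero distances *)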

Fifteen years ago, in the notes to Chapter 4 of Stephen Wolfram’s A New Kind of Science, a log plot of the distribution of the distances between successive zeros of such a three-term cosine sum—showing characteristic peaks—was shown:

A New Kind of Science, notes from Chapter 4

CONTINUE READING »

Posted in: Mathematics

Joanna Crown

Five Ways to Make Your Technical Presentations Awesome

April 19, 2018 — Joanna Crown, Strategic Projects

“Tell me and I forget. Teach me and I remember. Involve me and I learn.” — Benjamin Franklin

I can count on one hand the best presentations I have ever experienced: most recently, my university dynamics lecturer bringing out his electric guitar at the end of term to demonstrate sound waves; a pharmaceutical CEO giving an impassioned after-dinner oration about how his love of music influenced his business decisions; and, last but not least, my award-winning attempt at explaining quantum entanglement using a marble run and a cardboard box (I won a bottle of wine).

It’s perhaps equally easy to recall the worst presentations I’ve experienced—for example, too many PowerPoint presentations crammed full of more bullet points than a shooting target; infinitesimally small text that only Superman’s telescopic vision could handle; and presenters intent on slowly reading every word that they’ve squeezed onto a screen, thoroughly missing the point of a presentation: succinctly communicating interesting ideas to an audience.

CONTINUE READING »


Cat Frazier

Announcing Wolfram Presenter Tools

April 17, 2018 — Cat Frazier, Project Manager, Wolfram Blog

Introducing the Ultimate Technical Presentation Environment with Live Interactivity

We are delighted to announce that Wolfram’s latest comprehensive notebook technology extension is here. Released with Version 11.3 of Wolfram desktop products, Wolfram Presenter Tools is the world’s first fully computational presentation environment, seamlessly extending the notebook workflow for easy creation and delivery of dynamic presentations and slide shows, automatically scaled to fit any screen size. Our unique presentation features include rapid stylesheet updating and automatic slide breaking based on cell style.

CONTINUE READING »


Stephen Wolfram

Launching the Wolfram Challenges Site

April 12, 2018 — Stephen Wolfram

Wolfram Challenges

The more one does computational thinking, the better one gets at it. And today we’re launching the Wolfram Challenges site to give everyone a source of bite-sized computational thinking challenges based on the Wolfram Language. Use them to learn. Use them to stay sharp. Use them to prove how great you are.

The Challenges typically have the form: “Write a function to do X”. But because we’re using the Wolfram Language—with all its built-in computational intelligence—it’s easy to make the X be remarkably sophisticated.
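For instance, a beginner-level task in this style might be “Write a function that reverses the digits of an integer” (a made-up example, not necessarily one of the site’s Challenges), with a one-line solution such as:

reverseDigits[n_Integer] := FromDigits[Reverse[IntegerDigits[n]]]
reverseDigits[12345]  (* returns 54321 *)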

The site has a range of levels of Challenges. Some are good for beginners, while others will require serious effort even for experienced programmers and computational thinkers. Typically each Challenge has at least some known solution that’s at most a few lines of Wolfram Language code. But what are those lines of code?

CONTINUE READING »