May 28, 2019 — Daniel Lichtblau, Symbolic Algorithms Developer, Algorithms R&D

Did We Really Write What We Said We Wrote?

Several Months Ago…

I wrote a blog post about the disputed Federalist Papers. These were the 12 essays (out of a total of 85) with authorship claimed by both Alexander Hamilton and James Madison. Ever since the landmark statistical study by Mosteller and Wallace, published in 1963, the consensus opinion has been that all 12 were written by Madison (the Adair article of 1944, which also takes this position, discusses the long history of competing authorship claims for these essays). The field of work that gave rise to the methods used often goes by the name of “stylometry,” and it lies behind most methods for determining authorship from text alone (that is to say, in the absence of other information such as a physical typewritten or handwritten note). In the case of the disputed essays, the pool of candidate authors, at just two, is as small as it can be. Even so, these essays have been regarded as difficult to attribute, because Hamilton and Madison shared many statistical similarities in style.
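
To give a flavor of what such an analysis can look like in the Wolfram Language, here is a minimal stylometric sketch (ours, not the post’s actual method), where hamiltonEssays, madisonEssays and disputedEssay stand in for lists of essay texts:

    (* relative frequencies of a few "function words" often used as style markers *)
    featurize[text_String] :=
      N[Lookup[WordCounts[ToLowerCase[text]], {"upon", "whilst", "while", "on", "by", "to"}, 0]/WordCount[text]]

    (* train a classifier on essays of known authorship, then apply it to a disputed one *)
    classifier = Classify[<|"Hamilton" -> (featurize /@ hamiltonEssays), "Madison" -> (featurize /@ madisonEssays)|>];
    classifier[featurize[disputedEssay], "Probabilities"]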

Read More »


May 23, 2019 — Brian Wood, Lead Technical Marketing Writer, Document and Media Systems

Just as Wolfram was doing AI before it was cool, we have been doing data science since before it was mainstream. A prime example is the creation of Wolfram|Alpha—a massive project that involved engineering, modeling, analyzing, visualizing and interfacing with terabytes of data, developing a natural language interface, and deploying results in a sensible way. Wolfram|Alpha itself is a tool for doing data science, and its continued success is largely because of the underlying strategy we used to build it: a multiparadigm approach driven by natural curiosity, exploring all kinds of data, using advanced methods from a range of areas and automating as much as possible.

Any approach to data science can only be as effective as the computational tools driving it; luckily for us, we had the Wolfram Language at our disposal. Leveraging its universal symbolic representation, high-level automation and human readability—as well as its broad range of built-in computation, knowledge and interfaces—streamlined our process to help bring Wolfram|Alpha to fruition. In this post, I’ll discuss some key tenets of the multiparadigm approach, then demonstrate how they combine with the computational intelligence of the Wolfram Language to make the ideal workflow for not only discovering and presenting insights from your data, but also for creating scalable, reusable applications that optimize your data science processes.
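
As a toy illustration of that blend of built-in knowledge, visualization and automated machine learning (our example, not part of the Wolfram|Alpha codebase):

    (* built-in country knowledge, cleaned and paired *)
    data = Select[
      {CountryData[#, "GDPPerCapita"], CountryData[#, "LifeExpectancy"]} & /@ CountryData[],
      FreeQ[#, _Missing] &];

    (* one line each for a visualization and an automatically chosen predictive model *)
    ListLogLinearPlot[data, AxesLabel -> {"GDP per capita", "life expectancy"}]
    predictor = Predict[Rule @@@ QuantityMagnitude[data]];
    predictor[20000]  (* predicted life expectancy at a GDP per capita of 20,000 *)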

Doing Data Science Better with Wolfram and the Multiparadigm Approach

Read More »


May 9, 2019 — Stephen Wolfram

What Kind of a Thing Is the Wolfram Language?

I’ve sometimes found it a bit of a struggle to explain what the Wolfram Language really is. Yes, it’s a computer language—a programming language. And it does—in a uniquely productive way, I might add—what standard programming languages do. But that’s only a very small part of the story. And what I’ve finally come to realize is that one should actually think of the Wolfram Language as an entirely different—and new—kind of thing: what one can call a computational language.

So what is a computational language? It’s a language for expressing things in a computational way—and for capturing computational ways of thinking about things. It’s not just a language for telling computers what to do. It’s a language that both computers and humans can use to represent computational ways of thinking about things. It’s a language that puts into concrete form a computational view of everything. It’s a language that lets one use the computational paradigm as a framework for formulating and organizing one’s thoughts.

It’s only recently that I’ve begun to properly internalize just how broad the implications of having a computational language really are—even though, ironically, I’ve spent much of my life engaged precisely in the consuming task of building the world’s only large-scale computational language.
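
As one tiny example (ours, not the essay’s) of expressing a question about the world directly in computational form, with the knowledge needed to answer it built in:

    GeoDistance[
      Entity["City", {"Chicago", "Illinois", "UnitedStates"}],
      Entity["City", {"London", "GreaterLondon", "UnitedKingdom"}]]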

Read More »


April 26, 2019 — Tim Shedelbower, Visualization Developer, Algorithms R&D

Connect the dots. It was exciting to draw from number to number until the sudden discovery of a hidden cartoon. That was my inadvertent introduction to graph theory very early in school. Little did I know that adults use the same concept to uncover hidden patterns and solve problems, such as proving that a walk crossing each of the seven Königsberg bridges connecting four land masses exactly once is impossible, while coloring any map distinctly with just four colors is possible. These problems inspired the methods we know today as graph theory. And in honor of the work of the late mathematician and connect-the-dot author Elwyn Berlekamp, we see how sophisticated this “child’s play” can be by examining the different styles and themes we can apply to graphs.
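
For a quick taste of what the post explores, here is a small sketch of our own: the same graph rendered under a few of the built-in plot themes.

    g = Graph[{1 <-> 2, 2 <-> 3, 3 <-> 4, 4 <-> 1, 1 <-> 3, 2 <-> 4}];
    Row[Table[Graph[g, PlotTheme -> theme, ImageSize -> 120],
      {theme, {"Web", "Minimal", "Monochrome"}}]]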

Connect the dots with Graph and PlotTheme

Read More »


January 17, 2019 — Kathy Bautista, Senior Sales Initiatives Manager, Academic Sales

As many teachers make the transition back into classes after the holidays, quite a few have plans to update lessons to include segments that introduce data science concepts. Why, you ask?

According to a LinkedIn report published last week, the most promising job in the US in 2019 is data scientist. And if you search for the top “hard skills” needed for 2019, data science is often in the top 10.

Data science, applied computation, predictive analytics… no matter what you call it, in a nutshell it’s gathering insight from data through analysis and knowing what questions to ask to get the right answers. As technology continues to advance, the career landscape also continues to evolve with a greater emphasis on data—so data science has quickly become an essential skill that’s popping up in all sorts of careers, including engineering, business, astronomy, athletics, marketing, economics, farming, meteorology, urban planning, sociology and nursing.

Teacher Resources for Introducing Computational Thinking and Data Science

Read More »


January 10, 2019 — Brian Wood, Lead Technical Marketing Writer, Document and Media Systems

So far in this series, I’ve covered the process of extracting, cleaning and structuring data from a website. So what does one do with a structured dataset? Continuing with the Election Atlas data from the previous post, this final entry will talk about how to store your scraped data permanently and deploy results to the web for universal access and sharing.
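
Here is a minimal sketch of that pattern, assuming the scraped results already live in an expression called electionData (the cloud object names are placeholders):

    (* store the dataset permanently in the Wolfram Cloud *)
    CloudPut[electionData, "ElectionAtlas/results"];

    (* deploy a publicly accessible page that rereads the stored data on each visit *)
    CloudDeploy[Delayed[CloudGet["ElectionAtlas/results"]],
      "ElectionAtlas/view", Permissions -> "Public"]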

Deploying and Sharing with the Wolfram Language

Read More »


January 3, 2019 — Wolfram Blog Team

Mark Greenberg is a retired educator and contributor to the Tech-Based Teaching blog, which explores the intersections between computational thinking, edtech and learning. He recounts his experience adapting old game code to the Wolfram Language and deploying it through the Wolfram Cloud.

Chicken Scratch is an academic trivia game that I originally coded about 20 years ago. At the time I was the Academic Decathlon coach of a large urban high school, and I needed a fun way for my students to remember thousands of factoids for the Academic Decathlon competitions. The game turned out to be beneficial to our team, and so popular that other teams asked to buy it from us. I refreshed the questions each year and continued holding Chicken Scratch tournaments at the next two schools I worked in.
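
The post walks through the rebuilt game itself; purely as a flavor of this style of cloud deployment, here is a minimal, hypothetical quiz form of our own, not the actual Chicken Scratch code:

    CloudDeploy[
      FormFunction[
        {"answer" -> <|"Interpreter" -> "String", "Label" -> "Largest planet in the solar system?"|>},
        If[StringMatchQ[#answer, "*jupiter*", IgnoreCase -> True], "Correct!", "Try again."] &,
        "HTMLThemed"],
      Permissions -> "Public"]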

Chicken Scratch

Read More »


December 13, 2018 — Jesika Brooks, Blog Editor - EduTech, Public Relations

A version of this post was originally published on the Tech-Based Teaching blog as “Computational Lesson-Planning: Easy Ways to Introduce Computational Thinking into Your Lessons.” Tech-Based Teaching explores the intersections between computational thinking, edtech and learning.

Sometimes a syllabus is set in stone. You’ve got to cover X, Y and Z, and no amount of reworking or shifting assignments around can change that. Other factors can play a role too: limited time, limited resources or even a bit of nervousness at trying something new.

But what if you’d like to introduce some new ideas into your lessons—ideas like digital citizenship or computational thinking? Introducing computational thinking to fields that are not traditionally part of STEM can sometimes be a challenge, so feel free to share this journey with your children’s teachers, friends and colleagues.

The computational classroom

Read More »


December 6, 2018 — Tuseeta Banerjee, Research Scientist, Machine Learning

Julian Francis, a longtime user of the Wolfram Language, contacted us with a potential submission for the Wolfram Neural Net Repository. The repository consists of models that researchers at Wolfram have either trained in house or converted from their original source, then curated, thoroughly tested and rendered in a rich, computable format. Julian was our very first user to go through the process of converting and testing the nets.

We thought it would be interesting to interview him about the entire process of converting models for the repository, so that he could share his experience and future plans and inspire others.
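
To see what users get at the end of that pipeline, here is one way to pull a curated, ready-to-run model from the repository (exampleImage stands in for an image of a handwritten digit):

    net = NetModel["LeNet Trained on MNIST Data"];
    net[exampleImage]  (* returns the predicted digit *)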

Read More »


November 20, 2018 — Brian Wood, Lead Technical Marketing Writer, Document and Media Systems

Thanks to the Wolfram Language, English teacher Peter Nilsson is empowering his students with computational methods in literature, history, geography and a range of other non-STEM fields. Working with a group of other teachers at Deerfield Academy, he developed Distant Reading: an innovative course for introducing high-level digital humanities concepts to high-school students. Throughout the course, students learn in-demand coding skills and data science techniques while also finding creative ways to apply computational thinking to real-world topics that interest them.
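
As a small taste of the kind of computation a distant-reading exercise might involve, here is a toy example of our own, not something from Nilsson’s curriculum:

    text = ExampleData[{"Text", "AliceInWonderland"}];
    WordCloud[DeleteStopwords[text]]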

In this video, Nilsson describes how the built-in knowledge, broad subject coverage and intuitive coding workflow of the Wolfram Language were crucial to the success of his course:

Read More »