Posts Tagged “data”
Despite decades of reform efforts, mathematics teaching in the U.S. has changed little in the last century. As a result, it seems, American students have been left behind, now ranking 40th in the world in math literacy.
Several state and national reform efforts have tried to improve things. The most recent Common Core standards had a great deal of promise with their focus on how to teach mathematics, but after several years, changes in teaching practices have been minimal.
As an education researcher, I’ve observed teachers trying to implement reforms – often with limited success. They sometimes make changes that are more cosmetic than substantive (e.g., more student discussion and group activity), while failing to get at the heart of the matter: What does it truly mean to teach and learn mathematics?
Traditional mathematics teaching
Traditional middle or high school mathematics teaching in the U.S. typically follows this pattern: The teacher demonstrates a set of procedures that can be used to solve a particular kind of problem. A similar problem is then introduced for the class to solve together. Then, the students get a number of exercises to practice on their own.
For example, when students learn about the area of shapes, they’re given a set of formulas. They put numbers into the correct formula and compute a solution. More complex questions might give the students the area and have them work backwards to find a missing dimension. Students will often learn a different set of formulas each day: perhaps squares and rectangles one day, triangles the next.
Students in these kinds of lessons are learning to follow a rote process to arrive at a solution. This kind of instruction is so common that it’s seldom even questioned. After all, within a particular lesson, it makes the math seem easier, and students who are successful at getting the right answers find this kind of teaching to be very satisfying.
But it turns out that teaching mathematics this way can actually hinder learning. Children can become dependent on tricks and rules that don’t hold true in all situations, making it harder to adapt their knowledge to new situations.
For example, in traditional teaching, children learn that they should distribute a number by multiplying across parentheses and will practice doing so with numerous examples. When they begin learning how to solve equations, they often have trouble realizing that it’s not always needed. To illustrate, take the equation 3(x + 5) = 30. Children are likely to multiply the 3 across the parentheses to make 3x + 15 = 30. They might just as easily have divided both sides by 3 to make x + 5 = 10, but a child who learned the distribution method might have great difficulty recognizing the alternate method – or even that both procedures are equally correct.
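A quick sketch in Python confirms that the two procedures reach the same answer for 3(x + 5) = 30:

```python
# Solve 3(x + 5) = 30 two ways and confirm they agree.

# Method 1: distribute first -> 3x + 15 = 30 -> 3x = 15 -> x = 5
x_distribute = (30 - 15) / 3

# Method 2: divide both sides by 3 first -> x + 5 = 10 -> x = 5
x_divide = 30 / 3 - 5

print(x_distribute, x_divide)  # 5.0 5.0
```

A student with conceptual understanding sees both routes as applications of the same underlying rules about equality.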
More than a right answer
A key missing ingredient in these traditional lessons is conceptual understanding.
Concepts are ideas, meanings and relationships. It’s not just about knowing the procedure (like how to compute the area of a triangle) but also the significance behind the procedure (like what area means). How concepts and procedures are related is important as well, such as how the area of a triangle can be considered half the area of a rectangle and how that relationship can be seen in their area formulas.
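That rectangle-triangle relationship can be made concrete in a few lines of code – a minimal sketch using the standard area formulas:

```python
# A rectangle with base b and height h has area b * h;
# a triangle with the same base and height has half that area.
def rectangle_area(b, h):
    return b * h

def triangle_area(b, h):
    return 0.5 * b * h  # half the enclosing rectangle

b, h = 6, 4
print(triangle_area(b, h), rectangle_area(b, h) / 2)  # 12.0 12.0
```

The point is not the code itself but the relationship it encodes: the triangle formula is not an isolated fact to memorize, but a consequence of the rectangle formula.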
Teaching for conceptual understanding has several benefits. Less information has to be memorized, and students can translate their knowledge to new situations more easily. For example, understanding what area means and how areas of different shapes are related can help students understand the concept of volume better. And learning the relationship between area and volume can help students understand how to interpret what the volume means once it’s been calculated.
In short, building relationships between how to solve a problem and why it’s solved that way helps students use what they already know to solve new problems that they face. Students with a truly conceptual understanding can see how methods emerged from multiple interconnected ideas; their relationship to the solution goes deeper than rote drilling.
Teaching this way is a critical first step if students are to begin recognizing mathematics as meaningful. Conceptual understanding is a key ingredient to helping people think mathematically and use mathematics outside of a classroom.
The will to change
Conceptual understanding in mathematics has been recognized as important for over a century and widely discussed for decades. So why has it not been incorporated into the curriculum, and why does traditional teaching abound?
Learning conceptually can take longer and be more difficult than just presenting formulas. Teaching this way may require additional time commitments both in and outside the classroom. Students may have never been asked to think this way before.
There are systemic obstacles to face as well. A new teacher may face pressure from fellow teachers who teach in traditional ways. The culture of overtesting in the last two decades means that students face more pressure than ever to get right answers on tests.
The results of these tests are also being tied to teacher evaluation systems. Many teachers feel pressure to teach to the test, drilling students so that they can regurgitate information accurately.
If we really want to improve America’s mathematics education, we need to rethink both our education system and our teaching methods, and perhaps to consider how other countries approach mathematics instruction. Research has provided evidence that teaching conceptually has benefits not offered by traditional teaching. And students who learn conceptually typically do as well or better on achievement tests.
We prepare children to learn how to learn, not how to take a test.
Big Data is having a major effect on how we hear music.
Fifteen years ago, Steve Jobs introduced the iPod. Since then, most music fans have understood this has radically changed how they listen to music.
Less understood are the ways that raw information – accumulated via downloads, apps and online searches – is influencing not only what songs are marketed and sold, but which songs become hits.
Decisions about how to market and sell music, to some extent, still hinge upon subjective assumptions about what sounds good to an executive, or which artists might be easier to market. Increasingly, however, businesses are turning to big data and the analytics that can help turn this information into actions.
Big data is a term that reflects the amount of information people generate – and it’s a lot. Some estimate that humans now generate more information each minute than was produced from the earliest historical records through the year 2000.
Unsurprisingly, harnessing this data has shaped the music industry in radical new ways.
When it was all about the charts
In the 20th century, decisions about how to market and sell music were based upon assumptions about who would buy it or how they would hear it.
At times, purely subjective assumptions would guide major decisions. Some producers, like Phil Spector and Don Kirshner, earned reputations for their “golden ears” – their ability to intuit what people would want to listen to before they heard it. (If you aren’t aware of the SNL parody of this phenomenon, take a second to see “More Cowbell.”) Eventually, record companies incorporated more market-based objective information through focus groups, along with sheet music and record sales.
But the gold standard of information in the music industry became the “charts,” which track the comparative success of one recording against others.
Music charts have typically combined two pieces of information: what people are listening to (radio, jukeboxes and, today, streaming) and what records they’re buying.
Charts like the Billboard Hot 100 measure the exposure of a recording. If a song is in the first position on a list of pop songs, the presumption is that it’s the most popular – the most-played song on the radio, or the most-purchased in record stores. In the 1920s through the 1950s, when record charts began to appear in Billboard, they were compiled from sales information provided by select shops where records were sold. The number of times a recording played on the radio began to be incorporated into the charts in the 1950s.
While charts attempt to be objective, they don’t always capture musical tastes and listening habits. For example, in the 1950s, artists started appearing on multiple charts presumed to be distinct. When Chuck Berry made a recording of “Maybellene” that simultaneously appeared in the country and western, rhythm and blues, and pop charts, it upended certain assumptions that undergirded the music industry – specifically, that the marketplace was as segregated as the United States. Simply put, the industry assumed that pop and country were Caucasian, while R&B was African-American. Recordings like “Maybellene” and other “crossover” hits signaled that subjective tastes weren’t being accurately measured.
In the 1990s, chart information incorporated better data, with charts automatically being tracked via scans at record stores. Once sales data began to be accumulated across all stores using Nielsen Soundscan, some larger assumptions about what people were listening to were challenged. The best-selling recordings in the early 1990s were often country and hip-hop records, even though America’s radio stations during the 1980s had tended to privilege classic rock.
Record charts are constantly evolving. Billboard magazine has the longest-running series of charts evaluating different genres and styles of music, and so it makes a good standard for comparison. Yet new technology has made this system a bit problematic. For example, data generated from Pandora weren’t added to the Billboard charts until January of this year.
The end of genre?
Today, companies are trying to make decisions relying on as few assumptions as possible. Whereas in the past the industry relied primarily on sales and how often a song was played on the radio, it can now see which specific songs people are listening to, where they are hearing them and how they are consuming them.
On a daily basis, people generate 2.5 exabytes of data – the equivalent of 250,000 times all of the books in the Library of Congress. Obviously, not all of this data is useful to the music industry. But analytical software can utilize some of it to help the music industry understand the market.
The Music Genome Project, the algorithm behind Pandora, sifts through about 450 pieces of information about the sound of a recording. For example, a song might feature the drums as one of the loudest components of the sound, compared with other features of the recording. That measurement is a piece of data that can be incorporated into the larger model. Pandora uses these data to help listeners find music that sounds similar to what they have enjoyed in the past.
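Pandora’s actual model is proprietary, but the general idea – represent each song as a vector of measured attributes, then recommend its nearest neighbors in that space – can be sketched as follows. The feature names and values here are invented for illustration:

```python
import math

# Hypothetical feature vectors. The real Music Genome attributes number
# in the hundreds and are proprietary; these are made up.
songs = {
    "song_a": {"drum_prominence": 0.9, "tempo": 0.7, "vocal_grit": 0.2},
    "song_b": {"drum_prominence": 0.85, "tempo": 0.65, "vocal_grit": 0.3},
    "song_c": {"drum_prominence": 0.1, "tempo": 0.3, "vocal_grit": 0.9},
}

def distance(a, b):
    """Euclidean distance between two feature dictionaries."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def most_similar(name):
    """Return the song closest in feature space to the named song."""
    target = songs[name]
    others = {k: v for k, v in songs.items() if k != name}
    return min(others, key=lambda k: distance(target, others[k]))

print(most_similar("song_a"))  # song_b
```

In a real system the vectors would have hundreds of dimensions and the catalog millions of songs, but the measure-distance-and-rank logic is the same.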
This approach upends the 20th-century assumptions of genre. For example, a genre such as classic rock can become monolithic and exclusionary. Subjective decisions about what is and isn’t “rock” have historically been sexist and racist.
With Pandora, the sound of a recording becomes much more influential. Genre is only one of 450 pieces of information that’s being used to classify a song, so if it sounds like 75 percent of rock songs, then it likely counts as rock.
Meanwhile, Shazam began as an idea that turned sound into data. The smartphone app takes an acoustic fingerprint of a song’s sound to reveal the artist, song title and album title of the recording. When users hold their phones toward a speaker playing a recording, they quickly learn what they’re hearing.
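Shazam’s real algorithm hashes constellations of spectrogram peaks; the core idea – reduce audio to a compact, hashable fingerprint and look it up in a database – can be illustrated with a toy sketch (the peak-picking here is deliberately simplistic, and the song data is invented):

```python
def toy_fingerprint(samples):
    """Record the positions of local maxima as a hashable 'fingerprint'."""
    return tuple(
        i for i in range(1, len(samples) - 1)
        if samples[i] > samples[i - 1] and samples[i] > samples[i + 1]
    )

# Index a known recording by its fingerprint.
database = {toy_fingerprint([0, 3, 1, 4, 1, 5, 0]): ("Artist", "Title")}

# Later, an unknown clip is fingerprinted and looked up the same way.
heard = [0, 3, 1, 4, 1, 5, 0]
print(database.get(toy_fingerprint(heard)))  # ('Artist', 'Title')
```

The same lookup, run millions of times a day, is what produces the real-time listening data described below.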
The listening habits of Shazam’s 120 million active users can be viewed in real time, by geographic location. The music industry can now learn how many people, upon hearing a particular song, wanted to know its title and artist. This gives real-time data that can shape decisions about how – and to whom – songs are marketed, based on the preferences of the listeners. Derek Thompson, a journalist who has examined data’s effects on the music industry, has suggested that Shazam has shifted the power of deciding hits from the industry to the wisdom of the crowd.
The idea of converting a recording’s sound into data has also led to a different way of interpreting this information.
If we know the “sound” of past hits – the interaction between melody, rhythm, harmony, timbre and lyrics – is it possible to predict what the next big hit will be? Companies like Music Intelligence Solutions, Inc., with its software Uplaya, compare a new recording to older recordings to predict its success. Researchers at the University of Antwerp in Belgium studied dance songs to build a model that predicted hits with about 70 percent accuracy.
What happens next?
Even as new information becomes available, old models still help us organize that information. Billboard magazine now has a Social 50 chart, which tracks the artists most actively mentioned on the world’s leading social media sites.
In a way, social media can be thought of as analogous to the small musical scenes of the 20th century, like New York City’s CBGB or Seattle’s Sub Pop scene. In Facebook groups or on Twitter lists, some dedicated and like-minded fans are talking about the music they enjoy – and record companies want to listen. They’re able to follow how the “next big thing” is being voraciously discussed within a growing and devoted circle of fans.
Streaming music services are increasingly focused upon how social media is intertwined with the listening experience. The Social 50 chart is derived from information gathered by the company Next Big Sound, which is now owned by Pandora. In 2014, Spotify acquired the music analytics firm The Echo Nest, and in 2015, Apple acquired Semetric.
Songwriters and distributors now know – more than ever – how people listen to music and which sounds they seem to prefer.
But did people like OMI’s 2015 hit “Cheerleader” because of its sound and its buzz on social media – as Next Big Sound predicted? Or did it spread on these networks only because it possessed many of the traits of a successful record?
Does taste even matter? You’d like to think you listen to what you enjoy, not what the industry predicts you’ll like based on data. But is your taste your own? Or will the feedback loop – where what you’ve enjoyed in the past shapes what you hear today – change what you’ll like in the future?
What do you do if a border official asks for your phone PIN?
Photo Courtesy of Ervins Strauhmanis/Flickr, CC BY-SA
Author: Paul Ralph
On January 30 – three days after US President Donald Trump signed an executive order restricting immigration from several predominantly Muslim countries – an American scientist employed by NASA was detained at the US border until he relinquished his phone and PIN to border agents. Travelers are also reporting border agents reviewing their Facebook feeds, while the Department of Homeland Security considers requiring social media passwords as a condition of entry.
Intimidating travelers into revealing passwords is a much greater invasion of privacy than inspecting their belongings for contraband.
Technology pundits have already recommended steps to prevent privacy intrusion at the US border, including leaving your phone at home, encrypting your hard drive and enabling two-factor authentication. However, these steps only apply to US citizens. Visitors need a totally different strategy to protect their private information.
Giving border agents access to your devices and accounts is problematic for two reasons:
1) It violates the privacy of not only you but also your friends, family, colleagues and anyone else who has shared private messages, pictures, videos or data with you.
2) Doctors, lawyers, scientists, government officials and many business people carry devices containing sensitive data. For example, your lawyer might be carrying documents subject to attorney-client privilege. Providing such privileged information to border agents may be illegal.
This problem cannot be solved through normal cybersecurity countermeasures.
Encryption, passwords and two-factor authentication are useless if someone intimidates you into revealing your passwords. Leaving your devices at home or securely wiping them before traveling is ineffective if all of your data is in the cloud and accessible from any device. What do you do if border agents simply ask for your Facebook password?
And leaving your phone at home, wiping your devices and deactivating your social media will only increase suspicion.
What you can do
First, recognize that lying to a border agent (including giving them fake accounts) or obstructing their investigation will land you in serious trouble and that agents have sweeping power to deny entry to the US. So you need a strategy where you can fully cooperate without disclosing private data or acting suspiciously.
Second, recognize that there are two distinct threats:
1) Border agents extracting private or sensitive data from devices (phone, tablet, laptop, camera, USB drive, SIM card, etc.) that you are carrying.
2) Border agents compelling you to disclose your passwords or extracting your passwords from your devices.
Protecting your devices
To protect your privacy when traveling, here’s what you can do.
First, use a cloud-based service such as Dropbox, Google Drive, OneDrive or Box.com to back up all of your data. Use another service like Boxcryptor, Cryptomator or Sookasa to encrypt your data so that neither the storage provider nor government agencies can read it. While these services are not foolproof, they significantly increase the difficulty of accessing your data.
Next, cross the border with no or clean devices. Legally-purchased entertainment should be fine, but do not sync your contacts, calendar, email, social media apps, or anything that requires a password.
If a border agent asks you to unlock your device, simply do so and hand it over. There should be nothing for them to find. You can access your data from the cloud at your destination.
Protecting your cloud data
However, border agents do not need your device to access your online accounts. What happens if they simply demand your login credentials? Protecting your cloud data requires a more sophisticated strategy.
First, add all of your passwords to a password manager such as LastPass, KeePass or Dashlane. While you’re at it, change any passwords that are easy to guess, easy to remember or are duplicates.
Before leaving home, generate a new master password for your password manager that is difficult to guess and difficult to remember. Give the password to a trusted third party, such as your spouse or IT manager, and instruct them not to provide it until you call from your destination. (Don’t forget to memorize their phone number!)
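One simple way to produce such a hard-to-memorize password is with Python’s `secrets` module – a sketch, with the length of 32 characters chosen arbitrarily:

```python
import secrets
import string

# Characters to draw from: letters, digits and punctuation.
alphabet = string.ascii_letters + string.digits + string.punctuation

def new_master_password(length=32):
    """Return a cryptographically random, hard-to-memorize password."""
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(new_master_password())  # a fresh random 32-character string
```

Because the result is random and never written down at home, you genuinely cannot reproduce it from memory – which is exactly the point.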
If asked, you can now honestly say that you don’t know or have access to any of your passwords. If pressed, you can explain that your passwords are stored in a password vault precisely so that you cannot be compelled to divulge them, if, for example, you were abducted while traveling.
This may sound pretty suspicious, but we’re not done.
Raise the issue at your workplace. Emphasize the risks of leaking trade secrets or sensitive, protected or legally privileged data about customers, employees, strategy or research while traveling.
Encourage your organization to develop a policy of holding passwords for traveling employees and lending out secure travel-only devices. Make the policy official, print it and bring it with you when you travel.
Now if border agents demand your passwords, you genuinely don’t know them – and if they demand you explain how you cannot know your own passwords, you can show them your organization’s policy.
This may all seem like an instruction manual for criminals, but actual criminals will likely just create fake accounts. Rather, I believe it’s important to provide this advice to those who have done nothing illegal but who value their privacy in the face of intrusive government security measures.
Paul Ralph, Senior Lecturer in Computer Science