Not Qualified

Neither Clinton nor Trump is qualified. Certainly, Gary Johnson isn’t.

While we can debate the relative merits of the three candidates’ fitness to be our nation’s next Commander in Chief, what’s abundantly clear in the wake of the first presidential debate, not to mention some disturbingly ignorant comments by Libertarian candidate Johnson, is that none of the candidates is even remotely qualified to assume the mantle of TIC…Technology-in-Chief. As billionaire businessman and Dallas Mavericks owner Mark Cuban said recently of Mrs. Clinton, “…she just doesn’t understand technology, and she admits that she doesn’t understand technology, and to me that’s a huge negative, because in this day and age, I think wars are going to be fought, more by bytes, and more by cyber terrorism, than they are going to be fought by bombs and bullets, and if you don’t understand that, it’s going to be very difficult as Commander-in-Chief.” Cuban added that he didn’t think either of the two major party candidates was “technologically literate.”

In the first presidential debate, Hillary Clinton called on tech companies to help prevent ISIS from using the Internet to radicalize people and direct its followers. Trump claimed that even though the U.S. developed the Internet, ISIS is “beating us at our own game.” He then advocated for getting “very, very tough on cyber and cyber warfare… The security aspect of cyber is very, very tough. And maybe it’s hardly doable.” Sadly, amidst all the rhetoric, neither candidate bothered to lay out a coherent technology policy that showed any awareness of how vital technology has become to our nation, its defense, and the lives of its citizens.

The candidates’ poor grasp of technology is concerning, as our next president will govern at least through 2020, face many challenges, and thus needs to set a long-term technology policy.

Of course, no one expects the president to be a cyber warrior or even to set technology policy without expert advisors. But with major issues surrounding cybersecurity, consumer privacy, government access to digital data, Internet neutrality, and global Internet governance (among many other things), it’s time for the candidates to put aside divisive rhetoric and partisan platitudes and take a real leadership role on these challenging technology issues, showing how America can resolve them in a manner that protects our well-being while also encouraging growth and innovation.

No matter how many “tech” advisors the next president engages, it’s the president who makes the final decision. In the words of President Truman, “The buck stops here.” So, our next president will need to know enough about technology to evaluate his or her advisors’ recommendations.

In this way, the candidates are no different than a number of CEOs, too many of whom mistakenly believe that technology is beneath them, or that they don’t need to know technology. They have IT departments to deal with that. They have businesses to run. But as our friend Mr. Shakespeare once said, “’Tis an unweeded garden, that grows to seed; things rank and gross in nature possess it merely.”

Our next president will fail to understand technology at his or her own peril. Consider just one technology challenge awaiting the 45th Commander in Chief:

  • Cybersecurity: After tanking in the Senate in 2012, when predictable partisan differences over industry regulation killed the bill, a consistent drumbeat from government officials and an emerging “Cyber Industrial Complex” finally pushed Congress to pass the Cybersecurity Information Sharing Act, or CISA. Now officially the law of the land, CISA was created, in part, to deal with our nation’s omnipresent vulnerability to malicious hacking. The law enables federal agencies – including the National Security Agency – to share cybersecurity information (and, really, almost any information) with private corporations. Though cybersecurity is one of the more pressing issues facing the next president, neither of the two major party presidential candidates has made any concerted effort to outline positions on it, perhaps because there is no political benefit to doing so. As Wired Magazine reported, Donald Trump has no official position paper on cybersecurity or privacy, a rather disturbing thought in light of the San Bernardino terrorist’s encrypted iPhone, among many other things. Trump has made vague reference to China’s “rampant cybercrime” against the US, promised “stronger protections against Chinese hackers” as part of his position paper on the US-China trade relationship, and proposed “closing” parts of the Internet to combat ISIS. However, his cybersecurity positions have been “scattered, misguided, or both,” leading Wired to characterize them as “devoid of context, insight, clarity, or reason.”

Mrs. Clinton, on the other hand, has framed her cybersecurity positions within the context of her broader national security goals. She has also focused on China, saying that she’ll “encourage [them] to be a responsible stakeholder—including on cyberspace, human rights, trade, territorial disputes, and climate change—and hold it accountable if it does not.” Mrs. Clinton has called for a coalition of public and private interests working together to improve our cybersecurity, but she avoided commenting on CISA, deflected questions about who was right in the Apple-FBI encryption saga, and instead sought a middle ground that Wired stated “doesn’t exist” when it comes to cybersecurity and encryption.

But the fact that the candidates don’t seem to care all that much about one of this race’s most important issues doesn’t mean that we shouldn’t. Indeed, now more than ever, the presidential candidates need to learn, at the bare minimum, enough about technology to acquire a basic understanding of what it can do and what can go wrong. I’m not suggesting that our next president needs to know how to upgrade his or her computer’s memory or remove malware any more than I expect soldiers in battle to provide health care for patients. But the candidates must set a clear and cogent technology policy and be able to stand up in front of the American people and explain what needs to be done and why, even if this means hiring someone to tutor them on technology, ideally a trusted consultant with a foot in both the worlds of technology and government who, to borrow a well-worn political metaphor, can exist comfortably on both sides of the aisle.

The next president doesn’t need to peruse technical journals, but should be aware of technology trends, regularly asking the chief technology officer to look into an issue and send them a concise summary. The next president doesn’t need to be a hacker, but must learn what hackers can and cannot do, what constitutes cyber-warfare, and what resources the country has to safeguard our technology and networks.

I hope subsequent presidential debates provide more of an opportunity for both major party candidates to clarify their positions on technology and to show how they will lead on an agenda to expand technology opportunities and better protect America from hostile hackers as well as software bugs and malfunctions that put us all at risk.

Image By: Ashton Bingham

The post Not Qualified appeared first on Lloyd Marino.

Big Data Will Pick Our Next President

By Lloyd Marino

Forget the debates. They’re a lot of talk, but change few minds. Speeches and rallies gather the faithful, but bring in few new supporters. Newspapers and TV news are too easily distracted by the latest scandal and much of direct mail is thrown out unread. Instead, it is big data that will elect our next president, at least according to Wired Magazine.

Consequently, both major presidential candidates are striving to tap big data more successfully than their rivals and to learn from the campaigns’ experience with big data in the 2012 election.

Big Data in 2012 and Today

Big data was a major factor in Obama’s 2012 victory. Obama had a 100-person analytics team to wade through the data and classify every potential voter in swing states by their likelihood of showing up at the polls and voting for Obama. The campaign used that number for data modeling through a SQL massively parallel processing database called Narwhal, which could target likely voters with customized messages for its “get out the vote” efforts. It matched potential voters to similar volunteers who would be most persuasive on a person-to-person level.
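
The scoring step described above can be sketched with a simple logistic model. This is a minimal illustration, not the campaign’s actual Narwhal pipeline; the features, weights, and voters below are invented for the example.

```python
import math

def support_score(weights, features, bias=0.0):
    """Logistic score in [0, 1]: a modeled probability that a voter
    both turns out and supports the candidate."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: past turnout rate, party-registration match,
# and receptiveness to a prior canvass.
weights = [2.0, 1.5, 1.0]
voters = {
    "voter_a": [0.9, 1.0, 1.0],   # frequent voter, registered, receptive
    "voter_b": [0.2, 0.0, 0.0],   # rarely votes, unaffiliated
}

scores = {v: support_score(weights, f, bias=-2.0) for v, f in voters.items()}
# Sorting by score lets the campaign dispatch its best-matched
# volunteers to the persuadable top of the list.
ranked = sorted(scores, key=scores.get, reverse=True)
```

A real campaign model would be trained on survey responses rather than hand-set weights, but the output is the same kind of per-voter ranking.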

The Republicans have greatly scaled up their big data operations since 2012, when Orca, their data modeling system, crashed on Election Day. This year, Republicans are determined not to make the same mistakes. They have teamed with Deep Root Analytics to target their television advertising to specific groups. Meanwhile, TargetSmart helps Democrats optimize phone calls, fundraising, voter registration, direct mail, social media, and door-to-door canvassing with a national voter profiling database.

How Big Data Helps Campaigns

Both parties monitor social media through big data tools to find people’s concerns and adjust candidates’ stances accordingly. Big data also helps politicians target the best places for voter registration drives, best people to contact to remind them to vote, and best neighborhoods to organize rides to the polls. Using big data, political campaigns can screen out those most likely to vote for opponents so they can focus attention on those who can be swayed. Digital behavioral tracking identifies the voters in swing states who could be convinced to support a candidate and what would be the best way of reaching them. And ad targeting over the Internet ensures that each audience sees the most convincing ad.

Each state compiles a publicly available official voter file with name, gender, birthdate, address, and phone as well as political registration and how frequently the person votes. While the file does not say who the voter selected, even that can be estimated using polling data. Knowing the address, political data experts can supplement this record with census data on average levels of education, income, and race/ethnicity for that zip code, as well as property tax information. Private polling also brings in valuable information. Additional information on interests and concerns can be found on social media. Other data is volunteered by people joining email lists or signing petitions.
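
At its core, supplementing a voter-file record with zip-level census aggregates is a lookup and merge. A minimal sketch, with made-up records standing in for the real (much wider) files:

```python
# Hypothetical rows; actual voter files and census tables carry
# dozens of fields per record.
voter_file = [
    {"name": "A. Smith", "zip": "90210", "votes_cast_last_4": 4},
    {"name": "B. Jones", "zip": "10001", "votes_cast_last_4": 1},
]
census_by_zip = {
    "90210": {"median_income": 105000, "pct_college": 0.58},
    "10001": {"median_income": 90000, "pct_college": 0.51},
}

def enrich(record):
    """Join one voter-file row with census aggregates for its zip code."""
    merged = dict(record)
    merged.update(census_by_zip.get(record["zip"], {}))
    return merged

enriched = [enrich(r) for r in voter_file]
```

The enriched rows keep the individual voting history while gaining the neighborhood-level context that data experts layer on top.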

Big data also monitors the progress of campaigns. Nate Silver at FiveThirtyEight amalgamates polls, economic indicators, and historical trends to weigh the odds of victory on Election Day. While polling sometimes seems as much an art as a science, as witnessed by the constant complaints about skewed polls, Silver uses big data techniques to calculate a more accurate model than any one poll can provide.
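
A toy version of that kind of poll aggregation weights each poll by sample size and down-weights older polls. The halving-per-week decay and the polls themselves are invented for illustration; FiveThirtyEight’s real model also adjusts for pollster quality and house effects.

```python
def weighted_poll_average(polls):
    """Combine polls: weight by sample size, halve the weight for
    each week of age (a crude recency adjustment)."""
    num = den = 0.0
    for share, sample, weeks_old in polls:
        w = sample * (0.5 ** weeks_old)
        num += w * share
        den += w
    return num / den

# (candidate share, sample size, weeks since fielded) -- illustrative.
polls = [(0.48, 1000, 0), (0.52, 800, 1), (0.45, 1200, 3)]
estimate = weighted_poll_average(polls)
```

The aggregate lands between the individual polls but leans toward the newest, largest ones, which is the whole point of modeling rather than quoting any single survey.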

Big Data and the Future of American Politics

I am confident that in the future big data will play an even larger role. Right now, candidates use polling to set and adjust their positions. In the future, they will be able to mine social media for issue areas, allowing them to craft more nuanced, targeted messages and to adjust those messages for different audiences. Big data will allow greater monitoring of what potential voters want to hear from a candidate and could supplement poll information on how candidates perform. For instance, campaigns and news media currently use audience response meters to gauge a sample audience’s reactions during a debate and conduct a quick poll afterwards. But big data analytics on Twitter feeds produce a much larger sample of opinion. In the future, I expect less polling and more data analytics.
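
As a rough sketch of mining a Twitter stream for debate reaction, a word-list tally can stand in for the far richer sentiment models campaigns actually use. The word lists and tweets here are invented for the example.

```python
from collections import Counter

# Tiny illustrative lexicons; production sentiment models are learned,
# not hand-written lists.
POSITIVE = {"great", "love", "strong", "win"}
NEGATIVE = {"bad", "weak", "lose", "wrong"}

def debate_reaction(tweets):
    """Tally positive vs. negative words across a tweet stream."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            if word in POSITIVE:
                counts["positive"] += 1
            elif word in NEGATIVE:
                counts["negative"] += 1
    return counts

tweets = [
    "Great answer on jobs, strong close",
    "That was a weak dodge, just wrong",
    "Love the plan",
]
reaction = debate_reaction(tweets)
```

Even this crude tally scales to millions of tweets, which is what gives it a larger sample than any dial-test audience or snap poll.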

As more data is collected on each of us, political campaigns will be able to do more. If candidates had access to Amazon’s information on what you buy, they could send endorsements from your favorite authors and target content more exactly. Google records on your most searched topics could inform what position papers they send you. And tracking how much you spend could help set suggested amounts when fundraising.

Big data tools improve a campaign’s efficiency at convincing more people to vote for its candidate. Big data now has the power to model individuals’ behavior, targeting those most likely to respond and identifying the best ways of reaching them. If big data can help businesses convince consumers to buy Pepsi instead of Coke, it can certainly sway potential voters to go to the polls and choose a specific candidate.

I don’t generally use my blog to make an appeal. But elections determine the future of our nation. I implore all my readers to go out and vote on November 8th. And don’t forget the races for House, Senate, and state and local positions that may not get as much attention, but are still vital for our well-being.

Image By: Melissa Fox

Big Data Battles Global Climate Change

By Lloyd Marino

Now that autumn is here, with falling temperatures reminding us that winter is on its way, global warming may not seem like such a hot topic. But remember, when it is winter here, it is summer for the other half of the globe. Global warming needs to be a year-round concern.

August 2016 was the hottest August since reliable records started being kept in 1880. It followed the hottest July, hottest June, etc. going back to May 2015. August ties July 2016 for the hottest month ever. It was 1.8 degrees Fahrenheit hotter than the average for August. While politicians continue to debate, the data show global climate change is real and a growing problem.

Mapping Climate Change

Fortunately, big data can help us fight back against a warming Earth. For instance, big data is helping track the effects of climate change:

  • The Google Earth Engine maps 40 years of satellite imagery, allowing researchers to demonstrate the effects of climate change on lakes, shorelines, and global forest coverage. NASA’s Landsat provides the longest continuous record of the global land surface, monitoring how humans and climate have changed the face of the Earth.
  • Another NASA project, the NASA Center for Climate Simulation uses supercomputers to digest mounds of data to model the future of climate change. It can store 19 petabytes of data.
  • Microsoft’s Madingley project will eventually simulate all ecological processes affecting all life on the land and sea. Researchers are already using it to model the carbon cycle and plan to add more simulations to the “General Ecosystem Model” such as the effect of humans on habitat loss. The model would allow leaders to see the results of policy decisions before they are made in the real world.

Solving Climate Change

Big data also goes beyond tracking the damage to becoming part of the solution to climate change problems. For instance:

  • In 2015, nearly 200 nations agreed to reduce greenhouse gas emissions in the Paris Agreement. The U.S., China, Brazil, and 26 other nations have ratified the agreement. Big data will help measure global climate and ensure that signatories are keeping their promise.
  • The UN’s Big Data Climate Challenge, launched as part of the 2014 Climate Summit, challenged the international academic, scientific, technology, and policy communities to develop projects that use big data to drive action on global climate change. The winners were Global Forest Watch, which uses satellite data and crowdsourcing to monitor and manage forest resources, and the Site-Specific Agriculture Big Data Team at the International Center for Tropical Agriculture, which created a tool that combines climate and harvest data to help farmers make smart decisions about what to plant and when, predicting how climate change has altered the traditional wisdom of which crops work best in specific areas.
  • Opower uses big data to compile how much electricity local residents use and sends them reports comparing their consumption to their neighbors’ in order to encourage them to use less. It achieves power savings by inspiring behavior changes. Since 2007, this has saved almost 6 billion kilowatt-hours, enough to power Alaska and Hawaii for a whole year, cutting power plant carbon dioxide emissions by over nine billion pounds.
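
The neighbor-comparison report at the heart of that last approach is simple to sketch. The thresholds and wording below are assumptions for illustration, not Opower’s actual report format.

```python
def usage_report(household_kwh, neighborhood_kwh):
    """Compare a household's monthly electricity use with its
    neighbors and phrase the behavioral nudge."""
    avg = sum(neighborhood_kwh) / len(neighborhood_kwh)
    pct = (household_kwh - avg) / avg * 100
    if pct <= -5:
        verdict = "Great -- you used {:.0f}% less than similar homes.".format(-pct)
    elif pct >= 5:
        verdict = "You used {:.0f}% more than similar homes.".format(pct)
    else:
        verdict = "You're about average for your neighborhood."
    return avg, verdict

# Illustrative monthly readings (kWh) for one home and five neighbors.
avg, verdict = usage_report(550, [600, 650, 700, 580, 620])
```

The social-comparison framing, rather than the raw number, is what drives the behavior change the bullet describes.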

Additional Steps to Fight Climate Change

But big data can do more. Right now a big problem is policymakers’ reluctance to admit that global climate change exists. Big data can help show the extent of the problem by graphing temperature changes in states and political districts. Politicians who deny global climate change in the abstract will have greater difficulty when confronted with charts showing how it affects major cities under their jurisdiction.
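
One way to turn raw station data into such a chart is a least-squares trend line per state or district. A minimal sketch, with illustrative (not real) anomaly values:

```python
def linear_trend(years, anomalies):
    """Least-squares slope (degrees per year) of a temperature-anomaly
    series -- the kind of per-state trend the charts would show."""
    n = len(years)
    mx = sum(years) / n
    my = sum(anomalies) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, anomalies))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# Hypothetical anomalies (degrees F above a long-term mean).
years = [2011, 2012, 2013, 2014, 2015, 2016]
anoms = [0.9, 1.1, 1.0, 1.3, 1.6, 1.8]
slope = linear_trend(years, anoms)  # positive slope => warming
```

A positive slope on a chart labeled with a politician’s own district is much harder to wave away than a global abstraction.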

Another danger of climate change is the rise in sea level from melting ice. Big data can take existing satellite photos of coastal erosion and project them into the future, showing which areas are most likely to be submerged at different temperature levels. This data could help predict where new dams and sea barriers will be needed and where the government should encourage people to move to higher ground. The problem is especially acute because many major cities, including New York City and Los Angeles, sit on the coast and will flood as the waters rise.

I’ve discussed the Svalbard global seed vault before. It preserves over 850,000 seed samples. Researchers can use these seeds to try to make crops weather-resistant and breed in traits that will allow plants to survive on a warmer Earth and grow where nothing else can. Big data can track plant genes and help farmers determine what crops will be most successful under new weather conditions. This will reduce the damage a warmer earth will do to food sources.

Climate change is real, and humanity needs to learn both how to survive in a warming world and how to reduce the rate at which the temperature rises. Big data can help us chart the extent of the damage, predict where future problems will arise, and encourage people to reduce the greenhouse gas emissions that are aggravating the problem. Big data, and the human ability to innovate with the help of data, are two of our most valuable weapons in the fight against global warming.

Image By: Dawid Małecki

Non-Standardized Medicine

Standardized Medicine

Henry Ford, who pioneered the assembly line mass production manufacturing system, once said that a customer who bought his Model T car could have it “painted any colour that he wants so long as it is black.”

Today, of course, Ford cars come in a variety of colors and with an extensive range of options beyond Henry Ford’s wildest dreams.

People are not standardized. Walk down the street in most major cities in the U.S. and you’ll see a rainbow of different colors, sizes, and shapes.

Businesses recognize this. The hottest trend in restaurants these days is fast-casual places like Chipotle, where you can customize the meal to your taste from hundreds of thousands of possible combinations.

Yet somehow, when it comes to medicine, differences among people were largely ignored until very recently. Medicine is standardized, and doctors follow best practices, at least initially, for everyone. But since humans are varied, too often the standard of care devolves into trial and error, adding to the time and expense of treatment.

Personalized Medicine

Since the mapping of the human genome in 2003, medicine has been adapting to new knowledge. We now have the ability to analyze an individual’s genetic makeup and customize an appropriate medical treatment. Of course, it is rarely as simple as identifying one specific gene that causes a particular illness. Instead, using pharmacogenomics, researchers can determine how genes affect the body’s reaction to specific conditions and medicines. We know that people with certain genetic variants do better with a particular drug and have no reaction (or a negative one) to another.

According to Nature, the top ten most-used medical drugs help only between 1 in 25 and 1 in 4 of the people who take them. Consequently, doctors are increasingly using genetic tests before prescribing medicine. This helps them find more effective treatments with fewer unexpected side effects.

New research is being developed that considers differences among patients and uses genetics to determine the best treatment. This is especially true for cancer and AIDS patients who benefit from better screening, diagnosis, and therapy.

For instance, the federal $215 million national Precision Medicine Initiative (PMI) will conduct a longitudinal study to create a database of genetic information on a million volunteers that could be used to customize medicine by determining which types of people respond best to individual medicines. This will be especially helpful for those with cancer.

Personalized medicine tailors treatment to your disease risks, your genome, and lifestyle habits. As a result of the Human Genome Project, scientists have identified over 1,800 genes that affect diseases and developed 2,000 tests.

Sequencing for Everyone

If your doctor knew more about your individual genetic makeup and your genetic tendency to develop certain conditions, s/he could better advise you on changes to your lifestyle and even prescribe medication to avert harmful conditions.

While sequencing once was very expensive, a company called Illumina has developed a $1,000 sequencing service. As with most new technology, one can expect the price to decrease over time and with greater volume. And, since this is bleeding-edge technology, more regulation and oversight is needed to ensure quality service and interpretation of results.

For these reasons, I think we need a government program to sequence everybody’s genetic makeup and link it to one’s medical history. This would build on the PMI program while providing more information for doctors. Big data techniques could then enable doctors to practice better personalized medicine by finding which solutions helped people whose genetics are similar to their current patient’s.
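
That matching step could look like a nearest-neighbor search over marker profiles. The similarity metric, marker strings, and records below are all hypothetical; real panels involve thousands of loci and far subtler statistics.

```python
def similarity(a, b):
    """Fraction of genetic markers two profiles share (toy metric)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def best_precedents(patient_markers, database, k=2):
    """The k prior patients whose profiles most resemble the new
    patient's; their treatment outcomes then inform the doctor."""
    ranked = sorted(database,
                    key=lambda rec: similarity(patient_markers, rec["markers"]),
                    reverse=True)
    return ranked[:k]

# Hypothetical six-marker profiles and treatment outcomes.
database = [
    {"id": "p1", "markers": "AABBAB", "responded_to": "drug_x"},
    {"id": "p2", "markers": "BBABBA", "responded_to": "drug_y"},
    {"id": "p3", "markers": "AABBAA", "responded_to": "drug_x"},
]
matches = best_precedents("AABBAB", database)
```

The larger the sequenced population, the closer the matches, which is exactly why universal sequencing would make this kind of lookup more useful.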

Since one’s genetics do not change over time, it makes sense to do this sequencing early in life so the map can guide care through child and adulthood. Knowing more about genetics could help people avoid events that could trigger genes with negative effects.

Of course, there are dangers. Society would need strict rules about privacy to prevent insurance companies from using this data to determine whom to insure or to set higher rates for people with certain genes. The 2008 Genetic Information Nondiscrimination Act (GINA) was a good start, limiting employers’ and insurers’ access to genetic information, but more needs to be done to prevent “voluntary” release of this data through exorbitant rates imposed on people who do not surrender it.

We would also need strict rules to prevent people from using this data for non-medical purposes, such as profiling people with a genetic predilection towards criminal behavior. There’s also the potential to misuse genetic data to discourage people with certain genes from reproducing.

Still, despite these problems, there are strong advantages to universal sequencing. Big data techniques grow more effective with larger pools of data, so more entries will allow better matches to an individual’s profile. Universal sequencing will lower costs and avoid the problem of another medical advance that only benefits the rich. And society will benefit when doctors have more information on potential medical issues that could enable patients to avoid costly medical procedures in the future.

Image By: Olsztyn, Poland

Do We Need a New Agency to Reveal Software Secrets?

By Lloyd Marino

We all know our computers run software. Indeed, when you bank, use a credit card, make a phone call, or watch television, you are using software. Of course, this is just the tip of the software iceberg. The software that carries the global economy on its back isn’t an app on your smartphone or your computer; it’s the massive applications that run Walmart’s supply chain, Amazon’s enterprise resource planning (ERP) processes, Hertz’s reservation system, and Toyota’s production line, perhaps explaining what prompted professor and vice chair of the Southern Center for Human Rights James Kwak to exclaim in the Atlantic that “Software runs the world.”

Five years ago, Netscape co-founder Marc Andreessen wrote about how vitally important software had become in “Why Software Is Eating the World” for The Wall Street Journal. But here’s an interesting twist for today: How can we be sure the software running our lives is living up to its end of the bargain? What’s to stop the manufacturer from embedding software with secret commands, accidentally or on purpose?

The truth is, many of the machines we use every day really do have secret software.  

  • Volkswagen has admitted to having secret software in its diesel engines that switched on its emissions controls during testing but off during normal driving conditions. While the company has agreed to pay $14.7 billion in penalties and car buy-backs for this cheat with its 2.0-liter four-cylinder diesel engines, Volkswagen has maintained that its 85,000 cars with 3.0-liter engines do not have this “defeat device.” However, according to Reuters, a respected German publication reported in August that U.S. regulators found secret software in the 3.0-liter engines that shut down emissions controls after 22 minutes, slightly longer than the usual emissions test.  
  • Microsoft’s personal assistant Cortana, which is part of Windows 10, Windows Phone 8.1, and other Microsoft operating systems, answers questions and responds to voice commands. It tracks the user’s location, records and analyzes voices, and may communicate information about people’s writing, calendars, and schedules back to Microsoft. How much data is sent, and how does Microsoft use it? Right now, there is no way of knowing.
  • Anyone who watches NCIS or the many similar high-tech television crime solver programs knows that cell phones can be used to track people’s current locations. But you may not know that if you have an Android or Apple phone, your location information is being stored and used to track your frequent locations. And, at least in Android phones, the location history is sent to the company’s servers. While the user can disable this function, in many phones it is turned on by default. An even worse cell phone privacy violation involved Carrier IQ, whose software logged keystrokes, even data on passwords, and potentially sent this information to the manufacturer, resulting in a $9 million settlement.

Right now, when we buy software or a device with embedded software, we have to trust the company when it tells us what the software will do. Yes, there are reviews on the web and computer magazines, but there is no one who digs deeply into the software to see if it has secrets the company is not telling us.

One option would be a government agency that can regulate software the way the FCC regulates the airwaves, but better. In a tech-driven economy, the government needs a technology solution. Right now the government has neither enough people who understand technology well enough to examine how it works in the marketplace nor a public-facing official who takes charge of technology policy on a national or even state level. We need an effective governing body that implements technology solutions and scrutinizes their impact on society.

However, in the current environment, the government cannot do this. The government doesn’t have the people or the know-how. Nor does it have the will. In fact, Congress eliminated the Office of Technology Assessment in 1995, even though that agency simply provided nonpartisan research studies and had no lawmaking or regulatory power.

A better solution may be a private organization that certifies software, and products with embedded software, as safe. This is done in other fields. UL (originally Underwriters Laboratories) tests the safety of products and inspects factories before allowing them to use the UL seal. The Good Housekeeping Institute evaluates products for their effectiveness against advertising and packaging claims and then awards its Good Housekeeping seal to products that meet its standards.

An equivalent for software, a Software Examination Entity (SEE), would work with manufacturers to gain access to the software’s actual source code and have independent programmers and engineers examine the code to make sure it works and that there are no hidden surprises. It could then issue its own seal of approval.
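
One building block such a SEE could use is a cryptographic fingerprint tying its seal to the exact code its reviewers examined, so a vendor cannot quietly ship something different afterward. A sketch, assuming a SHA-256 digest published alongside the seal; the workflow and names are hypothetical.

```python
import hashlib

def fingerprint(source_bytes):
    """SHA-256 digest of the reviewed source (here, raw bytes)."""
    return hashlib.sha256(source_bytes).hexdigest()

def seal_is_valid(shipped_bytes, sealed_digest):
    """The code a vendor ships must hash to the same digest the
    reviewers sealed; any change invalidates the seal."""
    return fingerprint(shipped_bytes) == sealed_digest

reviewed = b"print('hello')  # audited build"
sealed = fingerprint(reviewed)                 # digest published with the seal

tampered = reviewed + b"\nsend_telemetry()"    # hypothetical hidden addition
check_clean = seal_is_valid(reviewed, sealed)
check_tampered = seal_is_valid(tampered, sealed)
```

Hashing alone doesn’t prove the code is benign, of course; it only guarantees that what ships is what was actually reviewed, which is the precondition for the seal meaning anything.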

Such an independent non-governmental organization may find it easier to gain the cooperation of the software industry than would a government agency, with its potential for bureaucracy and regulation. It would not interfere with innovation, nor impose rules on companies. Instead, companies would see the group’s seal of approval as a selling point, useful for advertising. Once the first software producer agreed to SEE’s review, all of its competitors would have to join too or risk being accused of hiding things from consumers.

Of course, software still has enormous potential to bring many benefits to people’s lives. Still, we would be wise to create a way to protect users from hidden traps in their software and to act as a watchdog for the industry. In an age of self-driving cars and automated medical care, software can be a matter of life or death. If the government cannot or will not act on its own, the industry itself must be galvanized into action to safeguard its customers and itself.

Image By: Markus Spiske
