IT causes overload and trades off with HUMINT – that’s Volz and Margolis
TS no resource wars
Our Johnson ev lists multiple other internal links – such as deep animosities between countries and hostile foreign powers
That assumes the status quo’s failures are caused by big data
It’s vital to solve the largest threats – that’s Johnson
HUMINT is key to countering both state and non-state threats.
Kevin R. Wilkinson – United States Army War College. The author is a former Counterintelligence Company Commander, 205th Military Intelligence Battalion. This thesis paper was overseen by Professor Charles D. Allen of the Department of Command Leadership and Management. This manuscript is submitted in partial fulfillment of the requirements of the Master of Strategic Studies Degree. The U.S. Army War College is accredited by the Commission on Higher Education of the Middle States Association of Colleges and Schools – “Unparalleled Need: Human Intelligence Collectors in the United States Army” - March 2013 - http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA590270
In the twenty-first century, the role of HUMINT is more important than ever. As employed during the Cold War, a significant portion of intelligence was collected using SIGINT and GEOINT methods. The COE assessment now discerns a hybrid threat encompassing both conventional and asymmetric warfare, which is difficult to obtain using SIGINT and GEOINT alone. Unlike other intelligence collection disciplines, environmental conditions such as weather or terrain do not hinder HUMINT collectors.12 HUMINT collection played a key role during Operation IRAQI FREEDOM. OIF was initially a force-on-force ground war using traditional maneuver forces. After six months of conventional conflict and on the verge of defeat, the Iraqi armed forces, with the assistance of insurgents, employed asymmetrical warfare. The continuation of conventional warfare paired with the asymmetric threat created a hybrid threat. HUMINT is effective when countering a conventional threat that consists of large signatures, such as discerning troop movement. However, it becomes invaluable when presented with an asymmetrical threat that entails a smaller signature, such as focusing on groups of insurgents, which other intelligence collection disciplines cannot solely collect on.
TS Accumulo solves and no tradeoff
There is a tradeoff – that’s above
( ) Accumulo’s not responsive to our human intel internal link. Even if the NSA can process a large quantity of data, the quality’s low unless HUMINT’s involved.
( ) Accumulo fails – Boston Marathon proves it doesn’t find the needle.
Frank Konkel is the editorial events editor for Government Executive Media Group and a technology journalist for its publications. He writes about emerging technologies, privacy, cybersecurity, policy and other issues at the intersection of government and technology. He began writing about technology at Federal Computer Week. Frank is a graduate of Michigan State University. “NSA shows how big 'big data' can be” - FCW (Federal Computer Week), a magazine covering technology - Jun 13, 2013 - http://fcw.com/articles/2013/06/13/nsa-big-data.aspx?m=1
As reported by Information Week, the NSA relies heavily on Accumulo, "a highly distributed, massively parallel processing key/value store capable of analyzing structured and unstructured data" to process much of its data. NSA's modified version of Accumulo, based on Google's BigTable data model, reportedly makes it possible for the agency to analyze data for patterns while protecting personally identifiable information – names, Social Security numbers and the like. Before news of Prism broke, NSA officials revealed a graph search it operates on top of Accumulo at a Carnegie Mellon tech conference. The graph is based on 4.4 trillion data points, which could represent phone numbers, IP addresses, locations, or calls made and to whom; connecting those points creates a graph with more than 70 trillion edges. For a human being, that kind of visualization is impossible, but for a vast, high-end computer system with the right big data tools and mathematical algorithms, some signals can be pulled out. Rep. Mike Rogers (R-Mich.), chairman of the House Intelligence Committee, publicly stated that the government's collection of phone records thwarted a terrorist plot inside the United States "within the last few years," and other media reports have cited anonymous intelligence insiders claiming several plots have been foiled. Needles in endless haystacks of data are not easy to find, and the NSA's current big data analytics methodology is far from a flawless system, as evidenced by the April 15 Boston Marathon bombings that killed three people and injured more than 200. The bombings were carried out by Chechen brothers Dzhokhar and Tamerlan Tsarnaev, the latter of whom was previously interviewed by the Federal Bureau of Investigation after the Russian Federal Security Service notified the agency in 2011 that he was a follower of radical Islam.
The brothers had made threats on Twitter prior to their attack as well, meaning several data points of suspicious behavior existed, yet no one detected a pattern in time to prevent them from setting off bombs in a public place filled with people. "We're still in the genesis of big data, we haven't even scratched the surface yet," said big data expert Ari Zoldan, CEO of New-York-based Quantum Networks. "In many ways, the technology hasn't evolved yet, it's still a new industry."
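The Konkel card describes Accumulo as a key/value store used to run graph searches over trillions of data points and edges. A minimal toy sketch of that idea follows — all names and data here are hypothetical illustrations, not the actual Accumulo API or any real dataset:

```python
# Illustrative sketch only: a toy key/value "edge table" in the spirit of the
# BigTable/Accumulo data model described in the card. Hypothetical throughout.
from collections import defaultdict

class ToyEdgeStore:
    """Stores directed edges as row_key -> set of values, mimicking a
    key/value layout where each row for a source node lists its neighbors."""

    def __init__(self):
        self.table = defaultdict(set)

    def put(self, source, target):
        # In a real distributed key/value store this would be a sorted,
        # replicated on-disk write; here it is just an in-memory set insert.
        self.table[source].add(target)

    def neighbors(self, source):
        return self.table[source]

    def connected(self, a, b, max_hops=2):
        """Breadth-first search: are a and b linked within max_hops edges?
        This is the kind of 'graph search' question the card gestures at."""
        frontier, seen = {a}, {a}
        for _ in range(max_hops):
            frontier = {t for s in frontier for t in self.neighbors(s)} - seen
            if b in frontier:
                return True
            seen |= frontier
        return False

# Usage: three hypothetical "data points" (phone numbers) and two calls.
store = ToyEdgeStore()
store.put("555-0001", "555-0002")   # call from A to B
store.put("555-0002", "555-0003")   # call from B to C
print(store.connected("555-0001", "555-0003"))  # linked in two hops: True
print(store.connected("555-0003", "555-0001"))  # no reverse path: False
```

The sketch also makes the aff's point concrete: the store can confirm that two known numbers are linked, but it cannot say whether that link matters — which is the quality judgment the HUMINT internal link assigns to human collectors.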