Even with the best-designed IT architecture, cyber incidents will undoubtedly impact your campaign at some point. Adversaries ranging from hackers to industrial spies and foreign governments will try to access your valuable and confidential information. Fortunately, there are ways to reduce the impact of these attacks. CTOlabs analysts collaborated with security professionals from across the community to produce this special report, which provides strategic guidance on how to improve your security and enhance your campaign’s IT functionality to mitigate these threats. Download at: Protect Your Campaign Against Cyber Threats
An Introduction to Recorded Future: The ability to leverage the predictive power of the web
What do you think of when you hear of a company named Recorded Future? My first thought on hearing that cool name was that they must have a great capability. Their name invokes a powerful metaphor: what if you could send a video camera into the future, record some things there, and then bring that camera back to today? How could that inform your decisions? As a thought experiment, we can clearly say this would be a powerful capability.
Well, we have long known it pays off to study the future. Since we can’t violate the laws of physics or bend the space-time continuum in those ways, analysts seek to predict what will occur by studying the past and building models and extrapolating trends. We do quite a bit of that at CTOvision.com, with a focus on the future of technology. But Recorded Future is approaching this field in a totally new way. They use the predictive power of the web to uncover what people know about the future. They continually scan high-quality news publications, blogs, public niche sources, trade journals, government websites, financial databases and other sources of information. They analyze the content of these sources and identify references to entities and events. They extract publication dates and any temporal expressions in the text. They conduct other assessments on items like the tone of the language. Then they organize and present the results in advanced visualizations ready for human assessment.
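For the curious, here is a toy sketch of what the extraction step in such a pipeline involves: pulling candidate entities and temporal expressions out of raw text. The regular expressions below are our own simplified assumptions for illustration, not Recorded Future’s actual (far more sophisticated) technology.

```python
# Illustrative sketch of the extraction step in a web-analysis pipeline:
# find candidate entities and temporal expressions in raw text.
# These naive regexes are our assumptions, not Recorded Future's methods.
import re

# Runs of 2-3 capitalized words, a crude stand-in for entity recognition
ENTITY = re.compile(r"\b(?:[A-Z][a-z]+ ){1,2}[A-Z][a-z]+\b")

# Explicit dates plus a few relative temporal expressions
DATE = re.compile(
    r"\b(?:January|February|March|April|May|June|July|August|"
    r"September|October|November|December) \d{1,2}, \d{4}\b"
    r"|\bnext (?:week|month|year)\b"
)

def extract(text):
    """Return candidate entities and temporal expressions found in text."""
    return {
        "entities": ENTITY.findall(text),
        "times": DATE.findall(text),
    }

sample = "Acme Robotics said it will open a Berlin office next year, by March 1, 2013."
print(extract(sample))
# {'entities': ['Acme Robotics'], 'times': ['next year', 'March 1, 2013']}
```

A real system layers entity disambiguation, event detection, and sentiment scoring on top of this kind of extraction before anything reaches a visualization.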
We have used Recorded Future in researching our reports and are now planning a deeper dive into their capabilities. We will be issuing more reports, both on those capabilities and on predictions we believe will be relevant to you, our audience.
Terrorism Research Center Reconstitutes as Non-Profit Organization
I took great pleasure in reading the release below regarding the reconstitution of the Terrorism Research Center. The founders of the Terrorism Research Center (Matthew Devost, Brian Houghton, and Neal Pollard) are all highly regarded national security professionals and thought leaders who bring years of proven past performance to helping the nation think through some very complex issues. I knew them well in 1996 when they established the first TRC, and I watched firsthand as they provided the community with prescient analysis on issues associated with asymmetric warfare, counterintelligence, cyber security, mission assurance, and counterterrorism research.
By bringing this activity back as a non-profit I believe the founders have established a new foundation for even greater contributions to the community.
For more see the release below from http://www.terrorism.org/2012/04/trc-2012/
Terrorism Research Center Reconstitutes as Non-Profit Organization
WASHINGTON DC, April 19, 2012 – The original co-founders of the Terrorism Research Center (TRC) are pleased to announce the organization has been re-established as a non-profit entity to continue the mission of raising public awareness of terrorism-related issues and establishing a knowledge base of security-related research and analysis. The new organization’s website and public knowledge base can be found online at www.terrorism.org.
First established on April 19, 1996, the one-year anniversary of the Oklahoma City terrorist bombing, the TRC operated for 15 years as a commercial entity providing research, analysis, and training on issues of terrorism and international security.
The three original co-founders, Matthew Devost, Brian Houghton, and Neal Pollard, will reconstitute a new board of directors composed of researchers, first responders, and academic and professional experts.
“The TRC had an incredible legacy as a commercial company,” says Matthew Devost. “We believe there is still a strong need to continue the research and collaboration on such critical topics in the public’s best interest.”
From 1996 through 2010, the TRC contributed to international counterterrorism and homeland security initiatives such as Project Responder and the Responder Knowledge Base, Terrorism Early Warning Groups, Project Pediatric Preparedness, Global Fusion Center, and the “Mirror Image” training program. These long-standing programs leveraged an international network of specialists from government, industry, and academia. Reconstituting TRC as a non-profit will help establish the next generation of programs, research, and training to combat emerging international security issues.
“Thousands of researchers utilized the TRC knowledge base on a daily basis,” says Brian Houghton. “Our intent is to open the dialog, provide valuable counterterrorism resources, and advance the latest thinking in counterterrorism for the public good.”
“We want to put the 15-year legacy and goodwill of TRC to continuing benefit for the public, rather than focus on a specific business model,” says Neal Pollard. “TRC was founded in the wake of the 1995 Oklahoma City bombing and made its most significant contributions to the nation and the world after the attacks of September 11, 2001. Now that the War on Terrorism has evolved and the United States is entering a new era of transnational threats, the TRC will maintain its familiar role as the vanguard of next-generation research into these emerging threats.”
ABOUT THE TRC
The Terrorism Research Center (TRC) is a non-profit think tank focused on investigating and researching global terrorism issues through multi-disciplinary collaboration among a group of international experts. Originally founded as a commercial entity in 1996, the TRC was an independent institute dedicated to the research of terrorism, information warfare and security, critical infrastructure protection, homeland security, and other issues of low-intensity political violence and gray-area phenomena. Over the course of 15 years, the TRC conducted research, analysis, and training on a wide range of counterterrorism and homeland security issues.
The White House Office of Science and Technology Policy and the National Science Foundation Present: Challenges and Opportunities in Big Data
On Thursday, 29 March 2012, from 2:00pm to 3:30pm EDT, the White House Office of Science and Technology Policy (OSTP) and the National Science Foundation (NSF) will hold an event focused on “Big Data” research and development. Federal government science heads from OSTP, NSF, NIH, DoE, DoD, DARPA, and USGS will outline how their agencies are engaged in Big Data research. The event will also include a discussion by a panel of thought leaders from academia and industry, moderated by Steve Lohr of the New York Times.
Our view: This event will be important for many reasons. Here is what to expect:
- Although scientists and technical leaders are the primary participants, all who are participating are also masters at explaining things in plain English. So this will be a GREAT event for those who would like to learn how to explain Big Data concepts.
- This type of event can be very helpful in enhancing cross-government sharing of lessons learned and concepts. I believe this will be very positive.
- This initiative will discuss R&D activities. Federal R&D focus is usually expressed at a high level, in terms of frameworks and broad goals, but those goals normally translate into budgets, so the research community outside government should pay close attention.
- You will be proud of the people speaking at this event. These government servants are smart people doing good things.
Would you like to watch this event? Thanks to the miracles of modern science, you can watch it right from your web browser. It will be webcast live.
We will be watching and will capture salient points to summarize here and in the Government Big Data Newsletter.
SOCOM’s Technology Wish List
Special Operations Command (SOCOM) recently released a Broad Agency Announcement soliciting proposals for several types of technologies it hopes to develop in cooperation with the private sector. We are interested in all forms of disruptive technology, and after looking over this wish list from SOCOM it is pretty clear they are looking for disruptive technologies as well. Below is a breakdown of what they are looking for. If your company can help in any of these fields, you can check out the BAA here.
AREAS OF TECHNOLOGICAL AND SCIENTIFIC IMPORTANCE FOR SRSE (SPECIAL RECONNAISSANCE, SURVEILLANCE, & EXPLOITATION)
Tags – Tagging, Tracking, and Locating technologies. SOCOM is looking for lightweight, low cost, and low energy GPS and non-GPS trackers, biological and chemical taggers, various beacons, audio and video recorders, and improvements in energy use.
Sensors – All types of activity sensors (magnetic, seismic, passive infrared, acoustic, fiber optic, and break wire), small radars, through-wall imaging, and just about any other type of sensor you can think of. Sensors are going to be huge in the future with reductions in manpower and “boots on the ground”.
Biometrics – Facial and iris recognition, fingerprint collection, detection and analysis of behavior patterns, voice print analysis, dynamic signature recognition, and DNA collection and processing.
Forensics – Document and electronic media exploitation (computers/cell phones); trace evidence collection, identification, and processing (explosive/chemical/biological/radiological); and detection of hidden rooms/chambers.
SIGINT – Signals Intelligence. SOCOM is looking for better radios and antennas, and better ways to intercept these types of transmissions, along with improved graphical user interfaces, training systems, and network capabilities.
Processing, Exploitation, and Dissemination (PED) – All source data discovery, advanced algorithms for data fusion/analysis, tasking and synchronization, receipt and processing of multiple intelligence ISR sensor data, Open Service Oriented Architecture, Common Interactive Broadcast (CIB) technologies.
Tactical Exploitation of National Capabilities (TENCAP) – This area concentrates on technologies, processing, and capabilities to extend National Technical Means investments and capabilities to the lowest tactical echelon user possible. Includes data infil/exfil, force tracking, SIGINT, HUMINT, GEOINT, MASINT, targeting, and command and control (C2).
Technical Support Systems (TSS) – Virtual Training, camouflage, advanced antenna designs, high efficiency electronics and communications.
Military Information Support Operations (MISO)
Information Management – All types.
Broadcast Technologies – Transmitters and antennas, enhanced Bluetooth technologies.
Simulation and Modeling Tools for MISO capabilities
Air Droppable, Scatterable Electronic Media – This media will be used to disseminate information and could take the form of a large variety of broadcast electronic media receivers, including miniaturized loudspeakers, entertainment devices, game device technologies, greeting cards, telephony technologies, text messaging, or other media capable of receiving and/or transmitting Internet broadcast or commercial radio frequency signals, or pre-programmed audio/audio-visual data.
Scatterable Media – Multimedia Messaging Service (MMS) Production Technology, Short Message Service (SMS) Production Technology, and Internet Production Technology.
Scatterable Media Integrated Radio, Cellular, Web, and MOP/MOE Requirements – The requirement is a user-generated social media radio application powered by the human voice, available on PC, Mac, Android, iPhone, and Nokia smartphones, that lets users share their thoughts and experiences.
The list of items and ideas that SOCOM is looking for is exhaustive, and I would love to be the one evaluating these products as proposals come in. SOCOM has always led the way in military acquisition by taking risks on new technologies and implementing disruptive ones across its smaller and more flexible force. These technologies are likely to help keep the United States military, and specifically its Special Operations Forces, at the tip of the spear in any conflict in the world.
A CTO Perspective: The Elders of the Internet Have a Message for the U.S. Congress
The EFF (Electronic Frontier Foundation) was founded in 1990 as a donor-funded non-profit focused on fighting for Internet freedoms. It frequently takes those fights to the courts, filing lawsuits against large corporations and the government, and it also works to inform legislators and the public at large. The EFF is cool, but of course you don’t have to agree with every position they have ever taken. That said, all in all I think they are focused on very important principles. In my opinion, their best actions are those where they base their positions on well-reasoned technical thought. The letter below is one of those times.
In fact, this may be the most well-thought-out, well-coordinated, technically grounded position they have ever established. In an open letter to the U.S. Congress coordinated by EFF’s Parker Higgins and Peter Eckersley, 83 of the Internet’s greatest engineers come together to provide well-reasoned opposition to the SOPA and PIPA Internet blacklist bills.
Thanks, EFF. And thanks for providing your content under a Creative Commons attribution license, which makes it so conducive to sharing. That too is nice of you.
This letter is printed below:
From: https://www.eff.org/deeplinks/2011/12/internet-inventors-warn-against-sopa-and-pipa
An Open Letter From Internet Engineers to the U.S. Congress
Today, a group of 83 prominent Internet inventors and engineers sent an open letter to members of the United States Congress, stating their opposition to the SOPA and PIPA Internet blacklist bills that are under consideration in the House and Senate respectively.
We, the undersigned, have played various parts in building a network called the Internet. We wrote and debugged the software; we defined the standards and protocols that talk over that network. Many of us invented parts of it. We’re just a little proud of the social and economic benefits that our project, the Internet, has brought with it.
Last year, many of us wrote to you and your colleagues to warn about the proposed “COICA” copyright and censorship legislation. Today, we are writing again to reiterate our concerns about the SOPA and PIPA derivatives of last year’s bill, that are under consideration in the House and Senate. In many respects, these proposals are worse than the one we were alarmed to read last year.
If enacted, either of these bills will create an environment of tremendous fear and uncertainty for technological innovation, and seriously harm the credibility of the United States in its role as a steward of key Internet infrastructure. Regardless of recent amendments to SOPA, both bills will risk fragmenting the Internet’s global domain name system (DNS) and have other capricious technical consequences. In exchange for this, such legislation would engender censorship that will simultaneously be circumvented by deliberate infringers while hampering innocent parties’ right and ability to communicate and express themselves online.
All censorship schemes impact speech beyond the category they were intended to restrict, but these bills are particularly egregious in that regard because they cause entire domains to vanish from the Web, not just infringing pages or files. Worse, an incredible range of useful, law-abiding sites can be blacklisted under these proposals. In fact, it seems that this has already begun to happen under the nascent DHS/ICE seizures program.
Censorship of Internet infrastructure will inevitably cause network errors and security problems. This is true in China, Iran and other countries that censor the network today; it will be just as true of American censorship. It is also true regardless of whether censorship is implemented via the DNS, proxies, firewalls, or any other method. Types of network errors and insecurity that we wrestle with today will become more widespread, and will affect sites other than those blacklisted by the American government.
The current bills — SOPA explicitly and PIPA implicitly — also threaten engineers who build Internet systems or offer services that are not readily and automatically compliant with censorship actions by the U.S. government. When we designed the Internet the first time, our priorities were reliability, robustness and minimizing central points of failure or control. We are alarmed that Congress is so close to mandating censorship-compliance as a design requirement for new Internet innovations. This can only damage the security of the network, and give authoritarian governments more power over what their citizens can read and publish.
The US government has regularly claimed that it supports a free and open Internet, both domestically and abroad. We cannot have a free and open Internet unless its naming and routing systems sit above the political concerns and objectives of any one government or industry. To date, the leading role the US has played in this infrastructure has been fairly uncontroversial because America is seen as a trustworthy arbiter and a neutral bastion of free expression. If the US begins to use its central position in the network for censorship that advances its political and economic agenda, the consequences will be far-reaching and destructive.
Senators, Congressmen, we believe the Internet is too important and too valuable to be endangered in this way, and implore you to put these bills aside.
Signed,
- Vint Cerf, co-designer of TCP/IP, one of the “fathers of the Internet”, signing as private citizen
- Paul Vixie, author of BIND, the most widely-used DNS server software, and President of the Internet Systems Consortium
- Tony Li, co-author of BGP (the protocol used to arrange Internet routing); chair of the IRTF’s Routing Research Group; a Cisco Fellow; and architect for many of the systems that have actually been used to build the Internet
- Steven Bellovin, invented the DNS cache contamination attack; co-authored the first book on Internet security; recipient of the 2007 NIST/NSA National Computer Systems Security Award and member of the DHS Science and Technology Advisory Committee
- Jim Gettys, editor of the HTTP/1.1 protocol standards, which we use to do everything on the Web
- Dave Kristol, co-author, RFCs 2109, 2965 (Web cookies); contributor, RFC 2616 (HTTP/1.1)
- Steve Deering, Ph.D., invented the IP multicast feature of the Internet; lead designer of IPv6 (version 6 of the Internet Protocol)
- David Ulevitch, CEO of OpenDNS, which offers alternative DNS services for enhanced security
- Elizabeth Feinler, director of the Network Information Center (NIC) at SRI International, administered the Internet Name Space from 1970 until 1989 and developed the naming conventions for the internet top level domains (TLDs) of .mil, .gov, .com, .org, etc. under contracts to DoD
- Robert W. Taylor, founded and funded the beginning of the ARPAnet; founded and managed the Xerox PARC Computer Science Lab which designed and built the first networked personal computer (Alto), the Ethernet, the first internet protocol and internet, and desktop publishing
- Fred Baker, former IETF chair, has written about 50 RFCs and contributed to about 150 more, regarding widely used Internet technology
- Dan Kaminsky, Chief Scientist, DKH
- Esther Dyson, EDventure; founding chairman, ICANN; former chairman, EFF; active investor in many start-ups that support commerce, news and advertising on the Internet; director, Sunlight Foundation
- Walt Daniels, IBM’s contributor to MIME, the mechanism used to add attachments to emails
- Nathaniel Borenstein, Chief Scientist, Mimecast; one of the two authors of the MIME protocol, and has worked on many other software systems and protocols, mostly related to e-mail and payments
- Simon Higgs, designed the role of the stealth DNS server that protects a.root-servers.net; worked on all versions of Draft Postel for creating new TLDs and addressed trademark issues with a complimentary Internet Draft; ran the shared-TLD mailing list back in 1995 which defined the domain name registry/registrar relationship; was a root server operator for the Open Root Server Consortium; founded coupons.com in 1994
- John Bartas, was the technical lead on the first commercial IP/TCP software for IBM PCs in 1985-1987 at The Wollongong Group. As part of that work, developed the first tunneling RFC, rfc-1088
- Nathan Eisenberg, Atlas Networks Senior System Administrator; manager of 25K sq. ft. of data centers which provide services to Starbucks, Oracle, and local state
- Dave Crocker, author of Internet standards including email, DKIM anti-abuse, electronic data interchange and facsimile, developer of CSNet and MCI national email services, former IETF Area Director for network management, DNS and standards, recipient of IEEE Internet Award for contributions to email, and serial entrepreneur
- Craig Partridge, architect of how email is routed through the Internet; designed the world’s fastest router in the mid 1990s
- Doug Moeller, Chief Technology Officer at Autonet Mobile
- John Todd, Lead Designer/Maintainer – Freenum Project (DNS-based, free telephony/chat pointer system), http://freenum.org/
- Alia Atlas, designed software in a core router (Avici) and has various RFCs around resiliency, MPLS, and ICMP
- Kelly Kane, shared web hosting network operator
- Robert Rodgers, distinguished engineer, Juniper Networks
- Anthony Lauck, helped design and standardize routing protocols and local area network protocols and served on the Internet Architecture Board
- Ramaswamy Aditya, built various networks and web/mail content and application hosting providers including AS10368 (DNAI) which is now part of AS6079 (RCN); did network engineering and peering for that provider; did network engineering for AS25 (UC Berkeley); currently does network engineering for AS177-179 and others (UMich)
- Blake Pfankuch, Connecting Point of Greeley, Network Engineer
- Jon Loeliger, has implemented OSPF, one of the main routing protocols used to determine IP packet delivery; at other companies, has helped design and build the actual computers used to implement core routers or storage delivery systems; at another company, installed network services (T-1 lines and ISP service) into Hotels and Airports across the country
- Jim Deleskie, internetMCI Sr. Network Engineer, Teleglobe Principal Network Architect
- David Barrett, Founder and CEO, Expensify
- Mikki Barry, VP Engineering of InterCon Systems Corp., creators of the first commercial applications software for the Macintosh platform and the first commercial Internet Service Provider in Japan
- Peter Rubenstein, helped to design and build the AOL backbone network, ATDN
- David Farber, distinguished Professor CMU; Principal in development of CSNET, NSFNET, NREN, GIGABIT TESTBED, and the first operational distributed computer system; EFF board member
- Bradford Chatterjee, Network Engineer, helped design and operate the backbone network for a nationwide ISP serving about 450,000 users
- Gary E. Miller, Network Engineer specializing in eCommerce
- Jon Callas, worked on a number of Internet security standards including OpenPGP, ZRTP, DKIM, Signed Syslog, SPKI, and others; also participated in other standards for applications and network routing
- John Kemp, Principal Software Architect, Nokia; helped build the distributed authorization protocol OAuth and its predecessors; former member of the W3C Technical Architecture Group
- Christian Huitema, worked on building the Internet in France and Europe in the 80’s, and authored many Internet standards related to IPv6, RTP, and SIP; a former member of the Internet Architecture Board
- Steve Goldstein, Program Officer for International Networking Coordination at the National Science Foundation 1989-2003, initiated several projects that spread Internet and advanced Internet capabilities globally
- David Newman, 20 years’ experience in performance testing of Internet infrastructure; author of three RFCs on measurement techniques (two on firewall performance, one on test traffic contents)
- Justin Krejci, helped build and run the two biggest and most successful municipal wifi networks located in Minneapolis, MN and Riverside, CA; building and running a new FTTH network in Minneapolis
- Christopher Liljenstolpe, was the chief architect for AS3561 (at the time about 30% of the Internet backbone by traffic), and AS1221 (Australia’s main Internet infrastructure)
- Joe Hamelin, co-founder of Seattle Internet Exchange (http://www.seattleix.net) in 1997, and former peering engineer for Amazon in 2001
- John Adams, operations engineer at Twitter, signing as a private citizen
- David M. Miller, CTO / Exec VP for DNS Made Easy (IP Anycast Managed Enterprise DNS provider)
- Seth Breidbart, helped build the Pluribus IMP/TIP for the ARPANET
- Timothy McGinnis, co-chair of the African Network Information Center Policy Development Working Group, and active in various IETF Working Groups
- Richard Kulawiec, 30 years designing/operating academic/commercial/ISP systems and networks
- Larry Stewart, built the Etherphone at Xerox, the first telephone system working over a local area network; designed early e-commerce systems for the Internet at Open Market
- John Pettitt, Internet commerce pioneer, online since 1983, CEO Free Range Content Inc.; founder/CTO CyberSource & Beyond.com; created online fraud protection software that processes over 2 billion transactions a year
- Brandon Ross, Chief Network Architect and CEO of Network Utility Force LLC
- Chris Boyd, runs a green hosting company and supports EFF-Austin as a board member
- Dr. Richard Clayton, designer of Turnpike, widely used Windows-based Internet access suite; prominent Computer Security researcher at Cambridge University
- Robert Bonomi, designed, built, and implemented, the Internet presence for a number of large corporations
- Owen DeLong, member of the ARIN Advisory Council who has spent more than a decade developing better IP addressing policies for the internet in North America and around the world
- Baudouin Schombe, blog design and content trainer
- Lyndon Nerenberg, Creator of IMAP Binary extension (RFC 3516)
- John Gilmore, co-designed BOOTP (RFC 951), which became DHCP, the way you get an IP address when you plug into an Ethernet or get on a WiFi access point; current EFF board member
- John Bond, Systems Engineer at RIPE NCC maintaining AS25152 (k.root-servers.net.) and AS197000 (f.in-addr-servers.arpa., f.ip6-servers.arpa.); signing as a private citizen
- Stephen Farrell, co-author on about 15 RFCs
- Samuel Moats, senior systems engineer for the Department of Defense; helps build and defend the networks that deliver data to Defense Department users
- John Vittal, created the first full email client and the email standards still in use today
- Ryan Rawdon, built out and maintains the network infrastructure for a rapidly growing company in our country’s bustling advertising industry; was on the technical operations team for one of our country’s largest residential ISPs
- Brian Haberman, has been involved in the design of IPv6, IGMP/MLD, and NTP within the IETF for nearly 15 years
- Eric Tykwinski, Network Engineer working for a small ISP based in the Philadelphia region; currently maintains the network as well as the DNS and server infrastructure
- Noel Chiappa, has been working on the lowest level stuff (the IP protocol level) since 1977; name on the ‘Birth of the Internet’ plaque at Stanford; actively helping to develop new ‘plumbing’ at that level
- Robert M. Hinden, worked on the gateways in the early Internet, author of many of the core IPv6 specifications, active in the IETF since the first IETF meeting, author of 37 RFCs, and current Internet Society Board of Trustee member
- Alexander McKenzie, former member of the Network Working Group and participated in the design of the first ARPAnet Host protocols; was the manager of the ARPAnet Network Operation Center that kept the network running in the early 1970s; was a charter member of the International Network Working Group that developed the ideas used in TCP and IP
- Keith Moore, was on the Internet Engineering Steering Group from 1996-2000, as one of two Area Directors for applications; wrote or co-wrote technical specification RFCs associated with email, WWW, and IPv6 transition
- Guy Almes, led the connection of universities in Texas to the NSFnet during the late 1980s; served as Chief Engineer of Internet2 in the late 1990s
- David Mercer, formerly of The River Internet, provided service to more of Arizona than any local or national ISP
- Paul Timmins, designed and runs the multi-state network of a medium sized telephone and internet company in the Midwest
- Stephen L. Casner, led the working group that designed the Real-time Transport Protocol that carries the voice signals in VoIP systems
- Tim Rutherford, DNS and network administrator at C4
- Mike Alexander, helped implement (on the Michigan Terminal System at the University of Michigan) one of the first EMail systems to be connected to the Internet (and to its predecessors such as Bitnet, Mailnet, and UUCP); helped with the basic work to connect MTS to the Internet; implemented various IP related drivers on early Macintosh systems: one allowed TCP/IP connections over ISDN lines and another made a TCP connection look like a serial port
- John Klensin, Ph.D., early and ongoing role in the design of Internet applications and coordination and administrative policies
- L. Jean Camp, former Senior Member of the Technical Staff at Sandia National Laboratories, focusing on computer security; eight years at Harvard’s Kennedy School; tenured Professor at Indiana University’s School of Informatics with research addressing security in society
- Louis Pouzin, designed and implemented the first computer network using datagrams (CYCLADES), from which TCP/IP was derived
- Carl Page, helped found eGroups, the biggest social network of its day, with 14 million users at the point of sale to Yahoo for around $430,000,000, at which point it became Yahoo Groups
- Phil Lapsley, co-author of the Internet Network News Transfer Protocol (NNTP), RFC 977, and developer of the NNTP reference implementation
- Jack Haverty (MSEE, BSEE MIT 1970), Principal Investigator for several DARPA projects including the first Internet development and operation; Corporate Network Architect for BBN; Founding member of the IAB/ICCB; Internet Architect and Corporate Founding Member of W3C for Oracle Corporation
- Glenn Ricart, Managed the original (FIX) Internet interconnection point
- Ben Laurie, Apache Software Foundation founder, OpenSSL core team member, security researcher. Over half the secure websites on the Internet are powered by his software.
- Chris Wellens, President & CEO, InterWorking Labs
GSA USASearch Wins 2011 Government Big Data Solutions Award
The Government Big Data Solutions Award was established to highlight innovative solutions and facilitate the exchange of best practices, lessons learned, and creative ideas for addressing Big Data challenges. The top five nominees and the overall winner were announced at Hadoop World in New York City on November 8, 2011.
The Government Big Data Solutions Award program is coordinated by CTOLabs.com. The 2011 judging panel included: Doug Cutting, creator of Apache Hadoop and architect at Cloudera; Alan Wade, former CIA and IC CIO; Ryan LaSalle, Director of Accenture Cyber R&D; Ed Granstedt, Senior VP and Director of the QinetiQ Strategic Solutions Center; and Chris Dorobek, founder, editor, and publisher of DorobekInsider.com.
The top five honorees of the Government Big Data Solutions Award are:
- USASearch: Hosted search services covering more than 500 government sites. Provides search and suggestion services plus analytics dashboards.
- GCE Federal: Cloud-based financial management solutions.
- PNNL Bioinformatics: Advancing understanding of health, biology, genetics and computing.
- SherpaSurfing: A cybersecurity solution that analyzes trends, finds malware, and writes alerts.
- US Department of State, Bureau of Consular Affairs: Large data set with critically important applications for citizen service and national security.
The winner of the 2011 Government Big Data Solutions Award is the USA Search Program of the US General Services Administration Office of Citizen Services and Innovative Technologies.
Award judges saw the USASearch Program as a great example of solving Big Data problems to improve government agility and provide better service for less. In line with the GSA’s cost-saving “build once, use many times” paradigm, USASearch has provided hosted search services for USA.gov and, through its Affiliate Program, over 500 other government websites. This is done in an incredibly cost-effective way, especially for the agencies involved (which receive these services from GSA for free).
In 2010, USASearch adopted an open architectural model to better exploit shared solutions and open source technology. This model leverages Cloudera’s Distribution including Apache Hadoop (CDH3). The move brought further cost savings and scalable shared search services, which drove up usage. USASearch adopted Hadoop to improve search results by aggregating and analyzing information on what users are searching for, their success in finding it, the time of the search, the affiliate, the results, and which results users click on, among other trends. To do so across hundreds of affiliates with growing traffic, USASearch considered scaling up or dividing its database systems, but knew those solutions would be costly and temporary. Instead, it turned to HDFS, Hadoop, and Apache Hive: a big data stack that could grow cost-effectively and without downtime, be naturally resilient to failures, and sensibly handle backups.
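To make that pattern concrete, here is a minimal sketch of the kind of log aggregation such a pipeline performs, written in the Hadoop streaming style (a mapper emitting key/value pairs and a reducer summing them). The record layout and field names are hypothetical, not USASearch’s actual schema, and in production this logic would run as distributed jobs or Hive queries rather than a single script.

```python
#!/usr/bin/env python
# Hypothetical sketch of search-log aggregation in the Hadoop streaming
# style: a mapper emits (key, value) pairs, a reducer sums per key.
# The tab-separated record layout below is an assumption for illustration.
import sys
from itertools import groupby

def mapper(lines):
    """Emit one ('affiliate<TAB>query', clicked) pair per log record."""
    for line in lines:
        # Assumed record: timestamp, affiliate, query, clicked (0 or 1)
        try:
            _ts, affiliate, query, clicked = line.rstrip("\n").split("\t")
        except ValueError:
            continue  # skip malformed records
        yield "%s\t%s" % (affiliate, query.lower()), int(clicked)

def reducer(pairs):
    """Sum total searches and successful clicks per (affiliate, query)."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        clicks = [v for _, v in group]
        yield key, len(clicks), sum(clicks)

if __name__ == "__main__":
    for key, searches, clicks in reducer(mapper(sys.stdin)):
        print("%s\t%d\t%d" % (key, searches, clicks))
```

The appeal of this model is that the same mapper and reducer run unchanged whether the logs are a test file on one machine or terabytes spread across a cluster.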
The overhaul of USASearch’s analytics is a dramatic success story. In the space of a few months, USASearch went from a brittle and hard-to-scale RDBMS-based analytics platform to an agile, scalable Hadoop-based system. By using state-of-the-art open source technology, USASearch has created a radically different search service that transforms the customer experience, and Hadoop’s uses and effects will continue to expand as more data sources and tools are added. Having a government-owned and controlled search service provides insight into the needs and concerns of Americans that can drive enhancements to other delivery channels. The public has a much improved experience when interacting with the government thanks to USASearch.
The GSA is to be congratulated for its mission-focused, citizen-centered, open approach to a big data challenge and a resulting solution that improves the experience of a broad swath of users of federal services. On behalf of our judges and the many citizens who use this capability on a daily basis, we say thank you, and congratulations on this well-deserved recognition.
Social Media Mimicry in the Workplace
Let’s face it: the social media that many enterprises hail also pose big problems. Employee use of the Internet, particularly social networks, is a big time waster. A new Nielsen study found that Americans spend 23% of their online time on social networks, with an increasing amount of that browsing time spent on mobile apps. Like the sarcastic Captain Renault in Casablanca, we’re shocked, shocked that a good deal of this browsing likely happens during work hours.
Much blog traffic (and some blogging itself!) comes from people browsing during work hours, sapping productivity. The web’s endless supply of lolcats, memes, and demotivational posters isn’t produced entirely by the stereotypical 12-year-olds or basement-dwellers of the popular imagination. Unfortunately, participation in social networks doesn’t just drain employee productivity; it also exposes enterprises to malware. Many companies ban Facebook and Twitter to try to stop their employees from using them on the job. A third of British companies, for example, ban social media completely. Others have tried to regulate the time employees spend. Some believe that banning Facebook would be like “banning the telephone”: a futile and counterproductive endeavor. While this problem is in some ways as old as the Internet and personal computers (anyone complaining about Farmville doesn’t remember Minesweeper), it is also not going away. So how have enterprises dealt with this over the last decade?
Many firms and government organizations have reacted to social media by trying to mimic it within the workplace, not only creating a better means of collaborating but also drawing employees into an ecosystem that shares prominent features of the networks they use for pleasure. Reacting to a slew of popular books and ideas about the wisdom of crowds and the wealth-creating power of networks, these organizations have set up internal wikis, blogs, and social networks. Within the Intelligence Community (IC), Intelink features blogging, a wiki called Intellipedia, and other hallmarks of social media.
As Wikipedia founder Jimmy Wales notes, company wikis are widespread, although not all are really useful. The idea that workflow should revolve around collaboration is now mundane rather than revolutionary. Management consultant Ori Brafman, author of The Starfish and the Spider, counts the Army among his many fans. The problem lies in gauging the sincerity of networked approaches. Despite many years of social media and much public enthusiasm over crowds and networks, it is unclear whether social media use in the workplace is leading to greater collaboration or simply putting a futuristic sheen on institutions that fundamentally still reflect the influence of Taylorism.
Of course, one problem with social media as it currently exists is the lack of user centralization. Keeping track of blogs, Twitter, Facebook, RSS, Google+, LinkedIn, FourSquare, FriendFeed, and other networks simultaneously requires software solutions. Some applications, most notably desktop suites like TweetDeck and Seesmic Desktop (two of my favorites), allow you to receive and post content to many different social media accounts at once. But is there anything like this for the workplace?
One collaboration tool in an increasingly crowded market that CTOVision will take a look at in the future is Jive Software’s suite of business collaboration and social software tools. Jive’s collaboration suite creates a unified stream of collaboration applications that much resembles the best features of Facebook, Twitter, and even iTunes. It filters to emphasize the most important updates based on rich social information such as favored keywords, use frequency, degree of connection to the poster, and similarity to other things you like. Streams can be “tuned” using like, dislike, and hide buttons. Many features, such as @mentions, announcements, direct messages, and private threads, are centralized in one place. A task list tracks your other activity and interfaces with apps, and a recommendations tab, modeled on commercial “genius” features, brings you content you are likely to find valuable. We’ll have more in-depth posts on this later, but we’re definitely looking forward to taking a look. A rough sketch of how such relevance filtering might work appears below.
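The sketch below is our own illustration of weighted relevance scoring, not Jive’s actual algorithm; every field name and weight here is an assumption chosen for clarity.

```python
# A minimal sketch of social-relevance ranking for an activity stream.
# Our illustration only, not Jive's algorithm; fields and weights are
# hypothetical assumptions chosen to show the technique.
from dataclasses import dataclass

@dataclass
class StreamItem:
    text: str
    keyword_hits: int         # matches against the user's favored keywords
    author_distance: int      # 1 = direct connection, 2 = friend-of-friend...
    author_interactions: int  # how often the user engages with this author
    user_tuned: int           # +1 liked, -1 disliked/hidden, 0 untouched

def score(item: StreamItem) -> float:
    """Combine social signals into a single relevance score."""
    s = 2.0 * item.keyword_hits
    s += 1.5 * item.author_interactions
    s += 3.0 / max(item.author_distance, 1)  # closer connections rank higher
    s += 5.0 * item.user_tuned               # explicit tuning dominates
    return s

items = [
    StreamItem("Quarterly roadmap update", 3, 1, 12, 0),
    StreamItem("Cafeteria menu changed", 0, 3, 1, -1),
]
for item in sorted(items, key=score, reverse=True):
    print(round(score(item), 1), item.text)
```

The interesting design question for any such system is how heavily explicit signals (likes, hides) should outweigh implicit ones (keywords, interaction history); the weights above simply make explicit tuning dominate.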