Six Approaches to Creating an Enterprise Cyber Intelligence Program

Josh Ray | Jul 30, 2014

According to the Verisign 2014 Cyber Threats and Trends Report, cyber intelligence has matured from an industry buzzword to a formal discipline, which has implications for vendors and security leaders. As recently as seven years ago, cyber-threat intelligence was the purview of a small handful of practitioners, limited mostly to the best-resourced organizations—primarily financial institutions that faced large financial losses due to cyber crime—and defense and intelligence agencies involved in computer network operations. Fast forward to today, and just about every business, large and small, depends on the Internet in some way for day-to-day operations, making cyber intelligence a critical component of a successful business plan. That said, organizations can go about creating a cyber intelligence program in a wide variety of ways.

As part of my support for Verisign's Intelligence-Driven Security program, I have the opportunity to speak with clients and partners across a variety of industries on this topic. I'd like to share some pragmatic tactical and strategic approaches to sourcing and applying cyber intelligence that I have gleaned through these conversations and my own experience. The following is a brief overview of six approaches, along with key considerations, that can help organizations of all types create a cyber intelligence program, build and align to a desired strategy, and create frameworks that -- if executed properly -- can become a defensive force multiplier.

The first two approaches deal with getting back to the basics in determining your program’s desired level of maturity and strategic focus.

Read more

Solving Challenges of Scale in Data and Language

Burt Kaliski | Jul 29, 2014

It would not be too much of an exaggeration to say that the early Internet operated on the scale of kilobytes, with all spoken languages represented using a single character encoding – ASCII. Today's global Internet, so fundamental to society and the world's economy, now enables access to orders of magnitude more information, connecting speakers of a full spectrum of languages.

The research challenges continue to scale along with data volumes and user diversity.

Two talks at the recent Verisign Labs Distinguished Speaker Series event held at Verisign's offices in Fribourg, Switzerland -- the first such event in Europe -- underscored the ongoing activity in this area.

The event's first speaker, Prof. Philippe Cudré-Mauroux, is the director of the eXascale Infolab at the University of Fribourg. Exascale is, of course, the next step in the series that starts with the kilobyte measure and continues with mega-, giga-, tera-, peta- and then exa-: on the order of 10^18 bytes.

Prof. Cudré-Mauroux described his research group's work on Hadaps, a new system for distributing and load-balancing data across servers by taking into account differences in server performance. He also presented one of the real-world applications of the kind that drive demand for exascale data analysis: an intelligent system for detecting leaks in municipal water systems based on pressure variations reported by sensors.
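The general idea of performance-aware placement can be illustrated with a short sketch. The following hypothetical Python example uses weighted rendezvous hashing -- a standard technique chosen here purely for illustration, not Hadaps' actual algorithm -- so that servers benchmarked as faster deterministically receive a proportionally larger share of data blocks:

```python
import hashlib
import math
from collections import Counter

# Relative performance weights, e.g. from benchmarking each server.
# A server with weight 2.5 should end up holding ~2.5x the data of a
# server with weight 1.0. (Names and weights here are hypothetical.)
servers = {"node-a": 1.0, "node-b": 2.5, "node-c": 0.5}

def place_block(block_id: str) -> str:
    """Deterministically assign a block to a server, biased by weight."""
    def score(node: str) -> float:
        digest = hashlib.sha256(f"{node}:{block_id}".encode()).digest()
        x = (int.from_bytes(digest[:8], "big") + 0.5) / 2**64  # uniform in (0, 1)
        return -servers[node] / math.log(x)  # weighted rendezvous score
    return max(servers, key=score)

# Distribute 10,000 blocks; the load should track the 1.0 : 2.5 : 0.5 weights.
print(Counter(place_block(f"block-{i}") for i in range(10_000)))
```

Because placement is a pure function of the block identifier and the weight table, any node can later recompute where a block lives without consulting a central directory.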

The remainder of his talk covered a new data publishing platform, the Entity Registry System (ERS). Designed for semi-connected environments, ERS provides scalability in the broader world where Internet connectivity is not always so reliable. (ERS was one of the projects funded in the Verisign Labs Infrastructure Grant program, and was previously reported on at the December installment of the series.)

Read more

IANA 2.0: Ensuring ICANN Accountability and Transparency for the Future

Keith Drazek | Jun 25, 2014

The National Telecommunications and Information Administration’s (NTIA) March 14, 2014, announcement proposing the transition of its legacy Internet Assigned Numbers Authority (IANA) stewardship role has presented the Internet Corporation for Assigned Names and Numbers (ICANN) multi-stakeholder community equal amounts of opportunity and responsibility. We have been handed a singular opportunity to define the terms of any stewardship transition and the fundamental responsibility to get it right.

Getting it right means ensuring, through a bottom-up, multi-stakeholder process, the reform of ICANN’s accountability structures to protect the community and the multi-stakeholder model prior to NTIA’s disengagement from its oversight and stewardship role. It also means acting quickly and efficiently so our window of opportunity is not missed.

At ICANN’s 50th meeting taking place in London this week, some have suggested that there are “elements” or “forces” among us who oppose the IANA stewardship transition and that calls for accountability reform are tantamount to delay tactics. I have found the opposite to be true. There is significant community support for NTIA’s announcement. There is significant support for NTIA’s four key principles. There is universal support for initiating a bottom-up, multi-stakeholder process to develop a recommended transition plan for NTIA’s consideration. The community also recognizes our limited time to get the work done and the need to propose concrete and implementable enhancements. And, perhaps most importantly, there’s a rapidly growing and strong consensus that ICANN’s accountability reform is a key dependency for any successful IANA stewardship.

On March 24, 2014, at the 49th ICANN meeting in Singapore, Verisign’s Pat Kane, senior vice president of Naming and Directory Services, made the following statements in support of the NTIA announcement, accountability and the multi-stakeholder process:

  • Verisign recognizes that it is probably the right time to transition the IANA functions and stewardship of those functions away from the United States government.
  • Verisign further recognizes that the ICANN community is ready to begin the conversation and its multi-stakeholder, bottom‐up structures have matured and will be the means by which a proposed solution for the transition is developed for continued operations of the IANA functions.
  • We support ICANN as the convener of this process as we find solutions for the clerical, authorizing, and technical operations of IANA which are all tied to accountability to the community.
  • The accountability regime that replaces the NTIA's stewardship should ensure enforceable and auditable transparency and accountability mechanisms. The DNS community and the global business and user communities deserve no less as such mechanisms are critical to the functioning of an open and secure Internet for everyone.
  • We look forward to contributing to the process and the proposed solution.
Those comments were made almost exactly three months ago. To dispel any uncertainty or misinformation around Verisign's position on the IANA stewardship transition and ICANN accountability, I will take this opportunity to reaffirm our views:

  • Verisign supports NTIA’s March 14, 2014, announcement;
  • Verisign supports NTIA’s four key principles;
  • Verisign supports the bottom-up, multi-stakeholder process now under way;
  • Verisign supports the target date of September 2015 for transition;
  • PROVIDED the multi-stakeholder community's recommendations for ICANN's accountability reform are accepted by NTIA before the final transition and sufficiently implemented by ICANN, subject to measurable deliverables.
Read more

The Evolving Threat of Amplification DDoS Attacks

Sean Leach | Jun 12, 2014

If there is one trend in the cybersecurity world over the last 12-18 months that cannot be ignored, it is the increasing prevalence and destructive power of amplification-based distributed denial of service (DDoS) attacks.

An amplification attack is a two-part DDoS attack that generally uses the User Datagram Protocol (UDP).  An attacker first sends a large number of small requests to unsuspecting third-party servers on the Internet.  The attacker crafts these requests to result in large responses, but they are otherwise normal except that their source addresses are rewritten (spoofed) so they appear to have come from the victim instead of the attacker.  When all the third-party servers send their large responses to the victim, the resulting amount of traffic is much more than the attacker could have generated alone. These attacks often overwhelm the resources of the victim, as attacks in the hundreds of Gbps are possible using this method.
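A back-of-envelope calculation shows why the technique is so effective. All of the numbers in the sketch below are illustrative assumptions, not measurements:

```python
# Rough arithmetic behind an amplification DDoS attack.
# Every figure here is an illustrative assumption.
attacker_uplink_bps = 100e6   # ~100 Mbps of small, spoofed requests
amplification = 50            # each request draws a ~50x larger response

victim_bps = attacker_uplink_bps * amplification
print(f"Traffic arriving at victim: {victim_bps / 1e9:.1f} Gbps")  # 5.0 Gbps

# Reflected through requests sent from 100 compromised hosts, the flood
# reaches the hundreds of Gbps observed in real attacks:
print(f"From 100 such hosts: {100 * victim_bps / 1e9:.0f} Gbps")   # 500 Gbps
```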

Two protocols heavily targeted for this technique over the last few months have been the domain name system (DNS) protocol and the network time protocol (NTP). For example, certain DNS queries sent to an authoritative DNS server will result in responses with a 10-20x amplification factor (e.g., a 40-byte DNS question can result in a 400-byte or greater response). Attackers can either generate the attack traffic themselves or use a botnet of compromised PCs to hide their footprints, but in either case they take advantage of a huge number of "open" DNS servers on the Internet that will respond to any request sent to them. This is a relatively easy technique that has proven successful in launching very large-scale attacks (i.e., several hundred Gbps in size).
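Operators can gauge their own servers' exposure by comparing the wire size of a query with the wire size of the response it draws. Here is a minimal sketch that assumes the third-party dnspython library, intended only for servers you operate or are authorized to test:

```python
# Minimal sketch of measuring a DNS server's amplification factor
# (assumes the third-party "dnspython" package: pip install dnspython).
import dns.message
import dns.query

def amplification_factor(server_ip: str, name: str) -> float:
    """Ratio of response size to query size for a worst-case-style query."""
    # An ANY query with DNSSEC records requested tends to maximize responses.
    query = dns.message.make_query(name, "ANY", want_dnssec=True)
    response = dns.query.udp(query, server_ip, timeout=2.0)
    return len(response.to_wire()) / len(query.to_wire())

# Hypothetical example (192.0.2.53 is a documentation-range address):
# print(f"{amplification_factor('192.0.2.53', 'example.com'):.1f}x")
```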

NTP is commonly used to synchronize electronic clocks so that computers around the world all agree on the time, which is critical to the functioning of electronic commerce. NTP relies on UDP just as DNS does, and it is vulnerable to similar attacks. One particularly damaging NTP attack uses the MONLIST command, which is found in older NTP servers. MONLIST returns the last 600 clients that an NTP server has talked to, yielding responses with an amplification factor of 10-200x from just a single server. Attacks that combine thousands of NTP servers can do incredible damage while using very little of the attacker's resources.

In the first quarter of this year, Verisign DDoS Protection Services saw an 83 percent jump in average attack size over Q4 2013, primarily attributable to NTP-based attacks. While DNS amplification was the most common vector in 2013 and continues to be seen, NTP has been the largest attack vector this year. We mitigated multiple amplification attacks -- commonly ranging from 50 to 75 Gbps -- for our customers. Directly related to the popularity of amplification attacks was a sharp decline in more complex application-layer attacks. With so many vulnerable NTP servers and reflection vectors readily available on the Internet, attackers were able to cause maximum disruption with minimum effort, ditching smarter layer-7 attacks in favor of volume-based amplification attacks. (Read Verisign's Q1 2014 DDoS Trends Report for more information.)

Read more

Verisign Named to the OTA’s 2014 Online Trust Honor Roll

Blog Moderator | Jun 11, 2014

We are pleased to announce that Verisign has made the 2014 Online Trust Honor Roll for demonstrating exceptional data protection, privacy and security in an effort to better protect our customers and brand from the increasing threats posed by cybercriminals.

The Online Trust Alliance (OTA), a nonprofit organization that works collaboratively with industry leaders to enhance online trust, completed comprehensive evaluations of more than 800 sites and mobile applications, analyzing companies' data protection, security and privacy practices against more than two dozen criteria. In total, approximately 10,000 webpages and more than 500 million emails were reviewed.

In addition to the in-depth analysis of recipients' websites, the OTA also analyzed domain name system (DNS) records, outbound email, and public records for recent data breach incidents and Federal Trade Commission (FTC) settlements. Key sectors audited include the Internet Retailer 500, the FDIC 100, the top 50 news, social media and government sites, and OTA members.

Thirty percent of the companies reviewed made the Honor Roll, and 22 percent, including Verisign, have made it for two consecutive years. To review the full 2014 Online Trust Honor Roll report, download a free copy at: 2014 Online Trust Honor Roll.