Web Admin Blog

Real Web Admins. Real World Experience.

Thoughts on the TRISC 2009 Conference

This was my third consecutive year attending the TRISC Conference and it gets better and better every year.  This year, the location was outstanding, the presenters were top-notch, and the keynotes were pretty good.  This was my first time actually presenting at the TRISC Conference and I thought they did an excellent job from the presenter's point of view as well.  They kept the presentations on time, they had my notes all printed up and ready for attendees, and the A/V equipment worked well.  No complaints from me there.

My favorite Keynote speaker was far and away Johnny Long.  His talk was on “No Tech Hacking” and he is as entertaining as he is talented.  If you ever get a chance to see him speak, definitely do so.  Also, be sure to check out his website at IHackCharities.org.

My least favorite Keynote speaker was Ken Watson.  He spoke in a monotone, and his presentation, about the centers around the country that the government is using to team up with industry to prevent attacks on critical infrastructure, was pretty lame.  I guess I just expected more, and from talking with others it seems like I’m not alone.

My favorite presentation was Robert Hansen and Rob MacDougal’s talk on “Assessing Your Web App Manually Without Hacking It”.  It was a simple concept that everyone from managers to developers to IT guys can follow to get an idea of how many vulnerabilities their application might contain.  RSnake!

My least favorite presentation was “The Importance of Log Management in Today’s Insecure World” by Ricky Allen and Randy Holloway from ArcSight.  Too vendory, not technical enough, and kind of a lame presentation in general.  Maybe I’m just bitter because I heard that the other presentations that took place while I was in this session were really good.

This was the first year that TRISC had a Casino Night and it was awesome.  I played Texas Hold ‘Em most of the night and took Nathan Sportsman’s money and a bunch of Rob MacDougal’s as well.  They had Roulette, Blackjack, and Craps tables there too.  Everyone started with $10,000 in chips, and for every $5,000 you had at the end of the night you got a raffle ticket.  I ended up with over $40,000 and 9 raffle tickets and won three different items.  Score.

Overall, TRISC 2009 was not the best conference that I’ve ever attended, but was certainly the best TRISC to date.  I was very impressed and am looking forward to next year.  FYI, all presentations from the conference are online and available for viewing here.

Anatomy of an Attack: From Incident to Expedient Resolution

For the first session of the morning on the last day of the TRISC 2009 Conference, I decided to attend the “Anatomy of an Attack: From Incident to Expedient Resolution” talk by Chris Smithee, a Systems Engineer at Lancope.  He talked about the different types of attacks that you see on your network and how flow data can be used to monitor and eliminate some of these threats.  My notes from the session are below: [Read the rest of this entry…]
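
Just to illustrate the flow-monitoring idea, here is a quick Python sketch of my own (not from the talk): it flags hosts that suddenly touch an unusual number of peers on one port, a classic scan or worm signature.  The record fields and the threshold are assumptions I made up for the example.

```python
from collections import defaultdict

# Hypothetical flow records: (src_ip, dst_ip, dst_port, bytes)
flows = [
    ("10.0.0.5", "10.0.1.1", 445, 1200),
    ("10.0.0.5", "10.0.1.2", 445, 1180),
    ("10.0.0.5", "10.0.1.3", 445, 1210),
    ("10.0.0.9", "10.0.2.7", 443, 58000),
]

FANOUT_THRESHOLD = 2  # distinct peers on one port before we call it suspicious

# Group destinations by (source host, destination port)
peers = defaultdict(set)
for src, dst, port, _ in flows:
    peers[(src, port)].add(dst)

# Flag any source fanning out to more peers than the threshold allows
for (src, port), dsts in peers.items():
    if len(dsts) > FANOUT_THRESHOLD:
        print(f"possible scan: {src} touched {len(dsts)} hosts on port {port}")
```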

PCI Compliance – Convert Drudgery Into a Powerful Security Framework

For my last session of the day at TRISC 2009, I decided to attend Joseph Krull’s presentation on PCI Compliance.  Joe works as a consultant for Accenture and has performed 60+ PCI engagements for various companies.  If your organization does any processing of credit card information, my notes from that session below should be useful:

  • As many as 65% of merchants are still not PCI compliant
  • Fines can be just the beginning; service charges and market share price dilution for non-compliant merchants have already had substantial repercussions in the US and may soon reach other regions
  • Many retailers still don’t have a clear view of compliance, and cannot effectively identify gaps
  • The first steps to PCI compliance are a thorough internal assessment and gap analysis – many merchants skip these steps and launch multiple costly projects
  • PCI provides a regulatory and compliance framework to help prevent credit card fraud for organizations that process card payments
  • The framework is comprehensive and effective but adherence to the specific standards is often challenging – primarily due to the complexities involved in both program design and implementation
  • Any merchant that accepts or processes credit cards must maintain compliance with the PCI DSS.  Specific obligations vary based on transaction volumes.
  • The focus right now is on Level 4 merchants.
  • TJX subject to 20 years of mandatory computer systems audits after massive breach

Challenges

  • Providing adequate and clear program management for the entire spectrum of PCI remediation activities (60-70% of organizations hand this to a “compliance guy” and typically fail; it should go to a senior security person)
  • Accurately scoping requirements throughout the organization, including remote sites and international operations
  • Evaluating and then implementing a wide variety of complex technologies – including encryption
  • Redesigning or replacing internal applications and payment systems to adequately protect cardholder data
  • Developing, implementing and enforcing new or revised policies and procedures across the entire organization
  • Differing opinions with auditors regarding PCI compliance requirements, especially related to the concept of “Compensating Controls”
  • Verifying PCI compliance for 3rd party partners that process data on behalf of the merchant

Differences from PCI DSS 1.1 to 1.2

  • Active monitoring plans for all 3rd party PCI Service Providers (Requirement 12.8)
  • Visits to offsite data storage locations at least annually
  • Mandatory phase out of weak encryption for wireless networks
  • Additional requirements for the use of “Compensating Controls” for specific PCI security requirements
  • Assessor testing procedures changed from “Observe the use of…” to “Verify the use of…”
  • Quality assurance program for PCI assessors
  • Process restricts or eliminates assessors from performing PCI work due to poor quality assessments
  • Assessors must now go beyond cursory observation of security controls and provide statistical samples
  • Assessors now going much deeper to include verifying individual system settings, requesting and analyzing configuration files, studying data flows, …

The Cost of Compliance and Non-Compliance

  • According to a comprehensive Forrester Research report on PCI compliance, companies spend between 2%-10% of their IT budget on PCI compliance
  • Credit card companies are levying fines on non-compliant merchants
    • Up to $25,000 per month for each month of non-compliance for L1’s ($5,000 for L4’s)
    • $10,000-$100,000 per month for prohibited storage of magnetic stripe data
    • Up to $500,000 per incident if a confirmed compromise occurs
    • Continued non-compliance may result in revocation of CC processing privileges
  • Banks and acquirers may increase processing fees for non-compliant merchants.  In 2008, one retailer estimated an annual increase in operational costs of $18 million due to this increase in processing fees on VISA card transactions alone.
  • Banks and acquirers can often pass on damages they incur to merchants
  • Repeat or additional PCI assessments & internal audits

Corporate Compliance Framework

  • Although PCI provides compliance requirements in most areas, it’s only a subset
  • ISO 27002:2005 is the framework they used alongside PCI
  • ISO gives good general requirements, but no explanation of how to do it
  • PCI sets the best practices
  • For example, ISO 5.1.1 maps to PCI 12.1, 12.4, and 12.6.2
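
As a toy illustration of that crosswalk, here is a tiny Python sketch; the ISO-to-PCI pairing is the one example from the talk, while the data structure itself is just my own way of writing it down.

```python
# Map each ISO 27002 control to the PCI DSS requirements it corresponds to
iso_to_pci = {
    "ISO 5.1.1 (information security policy document)": ["PCI 12.1", "PCI 12.4", "PCI 12.6.2"],
}

for iso_control, pci_reqs in iso_to_pci.items():
    print(f"{iso_control} -> {', '.join(pci_reqs)}")
```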

How to “Sell” PCI Compliance to Senior Management

  • Gloom and Doom
    • Fines and sanctions will sink us
    • Probability of success 40-50%
  • The PCI Umbrella
    • We need these 15 projects and ten new security products to be PCI compliant
    • Probability of success 40-50%
    • Who has done the gap assessment?
  • The Long Term Approach
    • If we achieve PCI compliance we will also be well on our way to other requirements
  • PCI compliance is not a project or technology based solution – it is being able to demonstrate that an organization has the means in place to protect sensitive information
  • Use as a building block to sell to senior management

Security Policy Architecture – How to fix your current disaster

One of the sessions that I attended during the day on the Tuesday of TRISC 2009 was by Doug Landoll from Lantego on “Security Policy Architecture”.  The presentation was a very good overview of how to put good security policies in place that are easily auditable should that need arise and that are as comprehensive as necessary.  The actual presentation slides are available here, and because he had some very good visual aids, I recommend you check out the slides themselves.  My notes, however, are below just in case the slides ever get deleted for some reason:

Importance of Security Policies

  • Govern expected behavior and process
    • Expected and prohibited behavior
    • Security process
  • Establish roles and responsibilities
    • Management & oversight
    • Execution
  • Define protection measures
    • Access controls
    • Physical security measures
    • Monitoring, audit, and oversight
    • Response priorities

Hazards of Weak Security Policies

  • Unclear expected behavior
    • Personnel guess at what is allowable & expected
    • Minor “infractions” – undefined & unnoticed
    • Leads to eroding culture of trust
  • Unclear roles and responsibilities
    • No oversight – administrator actions go unchecked
    • No management – activities according to whim
  • Unclear protection measures
    • “Heroes” define network security
    • Extremely tech-centric security posture

Security Architecture Mistakes

  • Mixed audience policies
    • Ex: Encryption policy
      • Use of encryption – users
      • Selection of encryption algorithms – system owners
      • Implementation of encryption – custodians
      • Key escrow – system owners
      • Oversight – auditors/management
    • Ex: Security Updates
      • Do not block network updates – users
      • Patch every Tuesday – admins
  • Who is the audience?

Common Policy Architecture Mistakes

  • One topic = one policy
  • Magic Policies
    • Templates
    • Handbooks
    • Pros
      • Solves the “blank piece of paper” problem
    • Cons
      • Old
      • No consideration for your environment, culture, or organization
      • Discourages analysis
      • No SME (Subject Matter Expert) involvement
      • Thwarts adoption
  • Match policy to requirements
    • PCI Policy project
    • HIPAA Policy project
    • TAC 202 Policy project
    • Etc
  • Problem
    • Requirements are organized by controls
    • Policies are organized by audience & topic

Clean Slate Approach

  1. Assess what you have
    • Independent & complete review process
  2. Determine controls framework
    • COBIT, ISO 27001
  3. Map in requirements
    • PCI DSS, HIPAA, TAC 202
  4. Organize and create policy statements
    • For each control (row) and requirement (column); see the sketch after this list
  5. Create policy architecture
    • According to audience & topic
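
To make steps 2 through 4 concrete, here is a small Python sketch of the controls-by-requirements matrix idea.  The control names and the sample statement are placeholders of mine, not from the talk.

```python
# Rows: framework controls; columns: regulatory requirements
controls = ["ISO 27001 A.5.1.1 (security policy document)",
            "ISO 27001 A.9.1.1 (physical security perimeter)"]
requirements = ["PCI DSS", "HIPAA", "TAC 202"]

# matrix[control][requirement] -> policy statements satisfying both
matrix = {c: {r: [] for r in requirements} for c in controls}

matrix[controls[0]]["PCI DSS"].append(
    "A written security policy is published, maintained, and reviewed annually.")

# Empty cells are gaps to resolve before drafting the policy architecture
for c in controls:
    missing = [r for r in requirements if not matrix[c][r]]
    if missing:
        print(f"GAP: {c} has no statement yet for {', '.join(missing)}")
```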

Policy Assessment Approach

  • Step 1 (Essential Elements Checklist)
  • Steps 2 (controls & framework) & 3 (map requirements)
  • Steps 4 (policy statements) & 5 (policy architecture)

Conclusion

  • Administrative Controls
    • Management, oversight, process
    • Address organizational and insider issues
  • Lack of policy architecture
    • Leads to weak administrative controls
    • Unplanned technology implementation
      • “implementation by appointment”
  • Ensure your controls are complete
  • Reaction is NOT a strategy (Don’t do it because a vendor called you or because an auditor said to do it)

Deep Packet Inspection and the Loss of Privacy and Security on the Internet

For my first session of the day on Tuesday of the TRISC 2009 conference I attended a presentation by Andrew MacFarlane from Data Foundry, Inc. on “Deep Packet Inspection and the Loss of Privacy and Security on the Internet”.  While the concept of DPI is nothing new to me, and I remember first hearing about it around the FBI’s Carnivore project, this particular use case was something that I hadn’t heard about.  Apparently pretty much every Tier 1 ISP has hopped onboard the DPI bandwagon and is now using the technology for everything from traffic prioritization to targeted advertising.  To make matters worse, you automatically agree to this type of monitoring by accepting your ISP’s terms of service.  Data Foundry has been one of the few ISPs to speak out against this practice, but unless more people (especially end-users) lobby their congressmen to remove this waiver of privacy rights from our terms of service, the future of privacy and security on the internet is awfully bleak.  My notes from the session are below:

[Read the rest of this entry…]

The Importance of Log Management in Today’s Insecure World

For my last session of the first day of the TRISC 2009 Conference, I made the mistake of attending Ricky Allen and Randy Holloway’s presentation on “The Importance of Log Management in Today’s Insecure World”.  I say “mistake” because, out of all of the presentations I attended over the entire three days of the conference, this was by far the most vendory, the least security-oriented, and the worst presentation.  Both of these guys work for ArcSight, and while they certainly know their log management, it was just a lame excuse for a presentation.  If I could go back in time, I would attend Chip Meadows’ presentation on “Pocket protectors, Purple hair and Paranoia” instead, as I heard he did a fantastic job.  Anyway, my notes from this presentation are below and the actual slides can be found here:

What is log management?

  • Ensuring your enterprise log data is accessible, easily retrievable and forensically sound
  • Properly dealing with mammoth amounts of event data stored in thousands of vendor-generated log files
  • Achieving compliance (SOX, HIPAA, PCI, FISMA) and supporting security and IT operations use of log data without breaking the bank
  • Log data now represents over 30% of ALL data generated by enterprises – creating a real need for log management
  • Dominant uses for log data include:
    • IT operations – systems/network health and availability
    • Security monitoring – perimeter or insider threat detection
    • Compliance monitoring – for regulations and industry standards

Why should I care?

  • Overwhelming flood of logs
  • Islands of defense
  • Week-long manual investigations
  • Massive false positives
  • Heterogeneous consoles
  • Many different formats
  • Regulations and their commonly used frameworks impose various requirements when it comes to log management
  • Regulatory mandates have further increased log retention requirements
  • Increased need to store both security and non-security log data
  • There continues to be an increased emphasis on audit quality data collection
  • Regulatory requirements
    • SOX: 7yrs
    • PCI: 1yr
    • GLBA: 6yrs
    • EU DR Directive: 2yrs
    • Basel II: 7yrs
    • HIPAA: 6/7yrs
  • Compliance requirements
    • More logging
    • More types of devices
    • Higher volumes of log data
    • Extensive reporting requirements
    • Broader user access
    • Long term retention requirements
    • Audit quality data

What can effective log management do for me?

  • Self-managing & scalable
  • Automated & cost-effective audits
  • IT Operations SLA Efficiency
  • Compliance
  • Simplified Forensic Investigations

Best Practices – NIST 800-92

  • Common log management problems
    • Poor tools and training for staff
    • Laborious and boring
    • Reactive analysis reduces the value of logs
    • Slow response
  • Solutions
    • Establish log management policies & procedures
    • Prioritize log management appropriately
    • Create and maintain a secure log management infrastructure
    • Provide proper support for all staff with log management responsibilities
    • Establish standard log management processes for system-level admins
  • The directive to only log and analyze data that is of the greatest importance helps provide sanity to the logging process
  • Collecting and storing all data regardless of its usefulness increases complexity and deployment costs
  • Secure storage and transmission guideline directly points to the importance of secure and robust capture, transmission and storage of logs
  • Organizations should carefully review the collection architecture, transmission security and access control capabilities of SEM solutions to ensure support of this section of the standard
  • Filtering and aggregation are recommended as a means to only capture logs of security and compliance value based on the corporate retention policy
  • This guideline helps organizations support a “reasonableness” position of not collecting useless log data
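
Here is a quick Python sketch of the filtering-and-aggregation idea, written by me rather than taken from the slides; the syslog-ish line format and the severity cutoff are assumptions for illustration.

```python
from collections import Counter

KEEP = {"err", "crit", "alert", "emerg"}  # severities with security/compliance value

raw_lines = [
    "info web01 GET /index.html 200",
    "err web01 authentication failure for user admin",
    "crit db01 disk failure on /dev/sda",
    "err web01 authentication failure for user admin",
]

# Filter: retain only events at or above the severity of interest
kept = [line for line in raw_lines if line.split()[0] in KEEP]

# Aggregate: count retained events per host
summary = Counter(line.split()[1] for line in kept)

for host, count in summary.items():
    print(f"{host}: {count} high-severity events retained")
```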

Developing a Log Management Program

  • Understand your log management needs (regulatory and operational requirements)
  • Review NIST 800-92
  • Understand your environment
    • Lots of devices to collect logs from
    • Multiple locations with no IT staff
    • Collection agents are not an option (see the syslog sketch after this list)
    • Network time settings
    • Low-bandwidth links
  • Devices
    • Firewalls/VPN
    • IDS/IPS
    • Servers and desktop OS
    • Network equipment
    • Vulnerability assessment
    • Anti-virus
    • Applications
    • DBs
    • Physical infrastructure
  • Establish prioritized log management policies & procedures
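
When agents are off the table, plain syslog forwarding from each device or application to a central collector is the usual fallback.  Below is a minimal Python example using the standard library’s SysLogHandler; the collector hostname and port are placeholders.

```python
import logging
import logging.handlers

# Ship events over UDP syslog to a central collector (hostname is a placeholder)
handler = logging.handlers.SysLogHandler(address=("logcollector.example.com", 514))
handler.setFormatter(logging.Formatter("myapp: %(levelname)s %(message)s"))

log = logging.getLogger("myapp")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.warning("failed login for user jdoe from 10.0.0.5")
```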

Log Management Checklist

  1. Scalable architecture
  2. Minimal footprint at remote sites
  3. Transaction assurance
  4. Audit and litigation quality data
  5. Universal event collection
  6. Ease of manageability
  7. ….

Assessing Your Web App Manually Without Hacking It

After giving my presentation on “Using Proxies to Secure Applications and More” at the TRISC 2009 conference, I decided to attend the presentation by Robert “RSnake” Hansen and Rob MacDougal entitled “Assessing Your Web App Manually Without Hacking It”.  The gist of this presentation was that with a few simple tools (Web Developer Toolbar, NoScript, your web browser) you can spend about an hour looking at the characteristics of a web application in order to determine what types of vulnerabilities it may have and how many.  My notes on the presentation are below:

[Read the rest of this entry…]
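
In the same passive spirit, here is a little Python sketch of my own (not the presenters’ methodology): it counts a page’s forms, input fields, and cookie flags as a rough proxy for attack surface, without sending any attack traffic.  The URL is a placeholder.

```python
import urllib.request
from html.parser import HTMLParser

class SurfaceCounter(HTMLParser):
    """Count forms and input fields as a rough proxy for attack surface."""
    def __init__(self):
        super().__init__()
        self.forms = 0
        self.inputs = 0

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.forms += 1
        elif tag in ("input", "textarea", "select"):
            self.inputs += 1

resp = urllib.request.urlopen("http://example.com/")  # placeholder URL
parser = SurfaceCounter()
parser.feed(resp.read().decode("utf-8", errors="replace"))

cookie = resp.headers.get("Set-Cookie", "")
print(f"forms: {parser.forms}, input fields: {parser.inputs}")
if cookie and "httponly" not in cookie.lower():
    print("session cookie is missing the HttpOnly flag")
```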

Spear Phishing – Breaking Into Wall Street & Critical Infrastructure

For my first breakout session of the TRISC 2009 Conference, I decided to check out Rohyt Belani’s presentation on Spear Phishing.  Rohyt is the CEO of Intrepidus Group and has spoken at a variety of conferences from BlackHat to OWASP to MISTI to Hack in the Box.  I had heard from several other conference attendees that he was a pretty good speaker and the topic seemed interesting enough so I went and wasn’t at all disappointed.  My notes (while not very long) from the presentation are below and the actual presentation can be found here:

  • CEO of Intrepidus Group
  • Adjunct Professor at Carnegie Mellon University
  • Frequent speaker at BlackHat, OWASP, MISTI, Hack in the Box
  • Phishing: The act of electronically luring a user into surrendering private information that will be used for identity theft or conducting an act that will compromise the victim’s computer system.
  • Example of spear phishing used for a pump-and-dump scam
  • Example of spear phishing used to download a Trojan, crack the admin password, and create domain administrator accounts on a Windows server.
  • They have a service called PhishMe.com that is used to run mock phishing attacks against companies.
  • 23% +/- 3% of users are susceptible to phishing attacks, based on PhishMe.com surveying
  • Convincing people to click via authority works better than reward
  • People are more “click happy” on a Friday afternoon
  • Use an existing website that’s vulnerable to XSS or create a fake SSL certificate

State of SOA

We have a pretty decent-sized SOA implementation here.  I got interviewed by InformationWeek magazine about it, and you can read my thoughts on SOAP vs REST and related topics in:

InformationWeek Analytics: State Of SOA

Browser Support – Just Do It

I am moved to post today by a gripe.  We have a lot of products and SaaS vendors that for some reason feel like they only need to support whichever single browser they have in mind.  I have Firefox 3, Internet Explorer 8 beta, and Chrome on my PC but still can’t use many of the darn programs I needed to use today.  (Of course, you can’t run different IE versions on the same box without resorting to virtualization or similar, so once I went to IE8 beta I knew I was in a world of hurt.)

Let me share with you the top 10 browsers we see on our Web site.  These numbers are from the last 500k visits so they should be statistically representative.

  • IE7 – 34.9%
  • Firefox – 31.0%
  • IE6 – 25.9%
  • Safari (includes Chrome) – 4.1%
  • Opera 9 – 2.3%
  • IE8 beta – 0.9%
  • Mozilla – 0.4%
  • Charlotte – 0.1%
  • Yeti – 0.1%
  • IE5 – 0.1%
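
For what it’s worth, numbers like these are easy to pull straight from an access log.  Here is a rough Python sketch that buckets User-Agent strings from an Apache-style combined log; the substring rules are simplifications and the log path is a placeholder.

```python
from collections import Counter

def bucket(ua):
    # Order matters: old IE strings say "MSIE <version>"
    if "MSIE 8" in ua: return "IE8 beta"
    if "MSIE 7" in ua: return "IE7"
    if "MSIE 6" in ua: return "IE6"
    if "Firefox" in ua: return "Firefox"
    if "Chrome" in ua or "Safari" in ua: return "Safari (includes Chrome)"
    if "Opera" in ua: return "Opera"
    return "Other"

counts = Counter()
with open("access.log") as fh:  # placeholder path
    for line in fh:
        # In combined log format the User-Agent is the last quoted field
        ua = line.rsplit('"', 2)[-2] if line.count('"') >= 2 else ""
        counts[bucket(ua)] += 1

total = sum(counts.values()) or 1
for name, n in counts.most_common():
    print(f"{name}: {100.0 * n / total:.1f}%")
```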

All you suppliers who think “I don’t need to support Firefox” – think again.  And you’re all doing a bad job of supporting IE8.  I know it’s new – but if you’ve already been only supporting one browser, be advised that as soon as IE8 goes gold everyone will auto-download it from Microsoft and then you’re SOL.   And there’s a lot of IE6 out there still, even if you are trying to do “IE only.”

To name names – Peopleclick.  IE7 support only.  Really?  You really only want 35% of users to use your product?  Or you think we’re going to mandate an internal company standard for your one app?  Get real.

Sharepoint.  No editing in Firefox.  When we evaluated intranet collaboration solutions here, we got down to Atlassian Confluence and Sharepoint as finalists, but then the “no Firefox” factor got Sharepoint booted for cause.  Confluence itself doesn’t support Safari until its newest version, which was annoying.  (Microsoft does promise the new version of Sharepoint out later this year will have adequate Firefox support.)

Graphs don’t work right in Firefox in Panorama, which is otherwise a pet favorite APM tool.

So guys – I know it’s a pain, but the Windows browser market is split and Macs are undergoing a renaissance.  Real companies don’t tell 5 to 10 percent of their customers to screw off (let alone 65%, Peopleclick).  It’s a cost of doing business.  You’re getting out of a whole bunch of client-side code writing by cheating and using Web browsers for it, so be grateful for that rather than ungrateful that you have to test in a couple different browsers.  Because corporate decision-makers like myself will ask, and we will make buying decisions based on it.