Operationalizing Open Source for Homeland Security

Sitting in the panel run by the Department of Homeland Security. DHS Open Source Lead, Tyler Foulkes, leads the conversation. In 2007 DHS was required to build relationships with the State Fusion Centers and to train them. Undersecretary Charlie Allen at DHS understood that intelligence happens in places other than the IC. They needed to find out what state, local and tribal leaders need to complete their missions. Training for the Fusion Centers (from DHS) goes out to the centers; they don’t force the Fusion Centers to come to them. Notes the DHS Strategic Open Source Vision booklet (released today). Protecting rights and keeping the balance on privacy is key and on their minds at all times.

Next up is Jack Showalter of CENTRA Technology Inc., who speaks on training the analysts, not the IT staff, on the technical aspects of research on today’s web: ad hoc vs. standing requirements, and how to use different technologies for each part of the mission.

First major theme: ad hoc requirements. The obvious technology is search (search engines). To go beyond Google, analysts are trained on how a search engine works so they know what they are getting and what they are missing. They need to know how often search engines refresh, how an engine gathers and indexes content, and why moving beyond Google and using any search engine effectively is so important.

Going from advanced syntax search on Google to clustering search engines (like Clusty.com). Demos other clustering search engines with visualization, like Kartoo.com. Different analysts think differently and can get results tailored to their style.
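
As an aside (my own sketch, not from the talk), the “advanced syntax” idea can be made concrete with a short Python snippet that composes an operator-based Google query; the operators shown (quoted phrases, site:, filetype:, minus-sign exclusion) are standard Google syntax, and the query terms themselves are invented for the example:

    import urllib.parse
    import webbrowser

    # Hypothetical example query using standard Google operators:
    # quoted phrase, site: restriction, filetype: filter and an exclusion.
    query = '"fusion center" "information sharing" site:.gov filetype:pdf -jobs'
    url = "https://www.google.com/search?" + urllib.parse.urlencode({"q": query})
    webbrowser.open(url)  # opens the composed search in the default browser

Other engines support similar (but not identical) operators, which is part of the point about knowing what each engine actually does.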

The concept of vertical search is introduced. Shows Highbeam Research, Infomine, the NOAA National Weather Service and Search Medica.

Discusses Cuil.com and the need to pay attention to new resources to keep an eye out for the “next big thing” – whether they succeed or not. Looking to the horizon for emerging resources, like evri.com (natural language processing – semantic indexing). Mentions Twitter (just as I send a tweet…). Also Twitter trends – get the news about disasters or events from locals on Twitter trends before the press gets it out (search.twitter.com).

Goes into the importance of directories when exploring topics. Mentions dmoz.org and lii.org. Notes the importance of understanding the business process behind the directories (i.e. volunteer or professional maintenance).

Discusses the deep web: what it is and how to tackle it.

Different needs apply to “standing requirements” – repetitious and mechanical searches should be automated. Identify important vs. urgent taskings. Addresses the time-consuming nature of standing requirements and the fact that ad hoc requirements often push the standing requirements out of the picture. Obvious first strategy: RSS. Not only RSS feeds, but filtered RSS (shows feedrinse.com as an example).
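
To make the filtered-RSS idea concrete, here is a minimal sketch (mine, not the speaker’s) of keyword-filtering a feed client-side with the Python feedparser library; a hosted service like FeedRinse does roughly this kind of filtering server-side. The feed URL and keywords below are placeholders:

    import feedparser  # third-party library: pip install feedparser

    FEED_URL = "https://example.com/news/rss.xml"   # placeholder feed
    KEYWORDS = ("flood", "evacuation", "outage")    # standing-requirement terms

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in KEYWORDS):
            # Only entries matching the watch list are surfaced to the analyst.
            print(entry.get("title", "(untitled)"), "-", entry.get("link", ""))

Scheduled against a list of feeds, something like this automates the repetitious, mechanical part of a standing requirement so the analyst’s time goes to analysis.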

Highlights distribution channels for open source and shows Deborah Osborne’s crime analysis podcast on BlogTalkRadio. This can be a method of professional development.

Highlights the need for open source professionals to be on the watch for new technologies and resources.

Q&A:

Q: Have you found a way to search podcasts for content?

A: Currently we haven’t found a way to search podcasts effectively.

Q: What are the different research methods you teach analysts?

A: Originally, analytic techniques were taught, but some were cut due to time constraints. Time management and research planning are taught. The end goal is to make sure that after the research there is time for analysis. This is the first wave of classes, but as the program continues, more techniques and further topics will be explored and trained.

Q: Comment: as far as searching podcasts – podscope.com and everyzing.com.

A: Fantastic, we will explore that.

Q: Are you targeting media outside of the internet?

A: A major block of training is on non-internet open sources.

Q: FeedRinse, is that client- or server-based? Have you discovered any attempts to give you misinformation?

A: Server-based. Another major block in the training is focused on evaluating sources, misinformation and disinformation.

Q: Concerns about much of the internet’s traffic being routed through the US.

A: Particularly with reference to IPv6, the next version of the protocol, the US will not be the belly button of the internet. We discuss assessing the credibility of sources used, but we don’t go into the technical weeds of how the internet works; we cover the basics before delving into deep waters.

Q: Is software applied to do trends and word counts? (note: memes)

A: Discussions on memes and conversation tracking through the blogosphere. Tag clouds, etc.
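
As a rough illustration of the word-count approach behind trend tracking and tag clouds (my sketch, not the speaker’s tooling), counting term frequencies over a batch of collected posts is enough to see which terms are spiking; the corpus and stopword list here are placeholders:

    import re
    from collections import Counter

    # Placeholder corpus: in practice this would be scraped blog posts or feed entries.
    posts = [
        "Wildfire evacuation ordered near the county line",
        "Evacuation routes congested; wildfire smoke reported downtown",
    ]

    STOPWORDS = {"the", "a", "an", "and", "or", "of", "near", "to"}
    words = (w for post in posts for w in re.findall(r"[a-z']+", post.lower()))
    counts = Counter(w for w in words if w not in STOPWORDS)

    # The most common terms are candidates for a tag cloud or trend alert.
    for term, n in counts.most_common(10):
        print(f"{term}: {n}")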

Q: Comment: Traffic diverted through foreign servers is more of an issue for covert operations, not for open source collection.

Q: How are you addressing operational security?

A: We describe web visibility and the basics of IP statistics. We demonstrate what their systems are showing when they visit a website and how to use basic opsec to counter these weaknesses.
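
To illustrate the “what your system shows” point (a sketch of my own, not the course material), this tiny local web server prints what any site can see about a visiting browser; run it and browse to http://localhost:8000 to see your own footprint:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class EchoHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Every web server you visit sees at least this much.
            print("Client IP:      ", self.client_address[0])
            print("User-Agent:     ", self.headers.get("User-Agent"))
            print("Referer:        ", self.headers.get("Referer"))
            print("Accept-Language:", self.headers.get("Accept-Language"))
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Check the server console for what you just revealed.\n")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), EchoHandler).serve_forever()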


One Comment on “Operationalizing Open Source for Homeland Security”

  1. das Says:

    One note on the IPv6 issue. Everyone, including the United States, is going to have to transition to IPv6, because it is inevitable that the addresses in the IPv4 address space (what we use now) will eventually be exhausted. This is a global issue, and no one is immune. IPv6 is an open international standard.

    See http://en.wikipedia.org/wiki/IPv4#Exhaustion and http://en.wikipedia.org/wiki/IPv6 for more information.

    It isn’t IPv6 that will make the US any less dominant; the issue is that some nations, like China, have sought changes to DNS that threaten to fragment the internet.

    The question was related to recent stories about global internet traffic increasingly not being routed through the US. This is absolutely a concern, but it is a concern from a standpoint of covert collection; that is, monitoring that traffic, as opposed to overt searching on the accessible internet, which is unrelated to traffic routing. Only fragmentation of the internet would impact open source activities.

    (Just to be absolutely clear, IPv6 could effectively “fragment” things, to an extent, while it is being adopted, which will be a long and difficult transition. But IPv6 is the next generation internet protocol as defined by international standards bodies; it’s not a “competitor” to IPv4, just its successor.)

