Analysis

21.06.2012 Analysis, CCTV, public safety, security threats, video, video surveillance Comments Off on LEIM 36th Annual IACP: Baltimore Police Department Incorporating Video Technology to Reduce Violence

LEIM 36th Annual IACP: Baltimore Police Department Incorporating Video Technology to Reduce Violence

This is one presentation I definitely wanted to attend at LEIM. Yes, the real Police of Baltimore were here. Not McNulty and his gang from The Wire, but Deputy Commissioner John Skinner and members of the Baltimore Police Force came to discuss how they are combating violence in Baltimore using video technology.

Deputy Commissioner Skinner opened the presentation by telling us that not so many years ago, Baltimore was America’s murder capital. In 1995 there was an estimated one homicide per day in Baltimore; by 2011 this was down to 197 for the year. Since 2007 they have achieved a 35% reduction in non-fatal shootings, and juvenile homicides have decreased since 2008. By using technology, Baltimore Police have achieved historic lows in violent crime and, even as budgets have been reduced, redirected the resources they have at the Police Department.

Gayle Guilford, Systems Director for Baltimore PD, explained the Side Partner Project. This initiative came out in 2009, with the aim of ‘getting Police out of their cars and back into the community’. Gayle recalled the years when Officers would ‘walk the beat’ and be an integral part of the community. They knew people by name, and would walk around the neighborhoods, speaking to people and generally being involved in the daily life around them. This was comforting for people, and they trusted their Police.

Since taking police off the streets and into patrol cars, citizens have become distanced from the police and have possibly lost the trust that once existed. To combat this and to get the Police back onto the streets, Blackberry phones with a “Pocket Cop” application were handed out to the Officers. With the phone in hand, they can carry out checks on warrants, driving records and photographs. It also tells them who they should be looking out for, and what their daily priorities are.

The system also allows Officers to start gathering evidence such as photos and streaming video before the forensic teams arrive. This is very useful in domestic violence situations. They can immediately upload evidence and get information out to other officers nearby who can assist; if they have to look for a suspect, they will have a photograph to help them.

The application is also wired up to GPS and Google Maps, which tells Police dispatchers where officers are located and their availability to respond to a situation. Gayle hopes that future budgets will allow every Police Officer to carry a Blackberry within the next few years.

Next to speak was Lt. Hood, Director of Law Enforcement Operations for CitiWatch in Baltimore PD. Lt. Hood is one of those people who are immediately likeable, and when he began his presentation on CCTV, I was enthralled.

The CitiWatch program is one of America’s most sophisticated surveillance networks in operation. It started off with 50 cameras and now has over 500 across Baltimore City. The cameras are monitored by the Criminal Intelligence Watch Center inside the Baltimore PD, 24 hours a day, 7 days a week by specially trained CCTV operators, mainly retired Police Officers.

Extra staff are used on weekends to monitor the busiest times, especially at night. Lt. Hood likes using retired Police as they have the knowledge and are also trained on a variety of subjects including radio communications and uniform crime reporting.

CitiWatch cameras record footage, which is stored for 28 days and then archived off site. Video footage is crucial in identifying suspects and capturing evidence in many crimes. Lt. Hood then proceeded to show us some footage with which the Police were able to make subsequent arrests and also to prevent certain criminal acts.
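A retention rule like the one described is simple enough to express in code. Here is a minimal sketch: the 28-day window comes from the presentation, but the function and field names are my own invention, not Baltimore PD’s actual system:

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 28  # footage kept on site for 28 days, then archived off site

def should_archive(recorded_at: datetime, now: datetime) -> bool:
    """True once a recording has aged past the on-site retention window."""
    return now - recorded_at > timedelta(days=RETENTION_DAYS)

now = datetime(2012, 6, 21)
print(should_archive(now - timedelta(days=30), now))  # 30-day-old clip: True
print(should_archive(now - timedelta(days=5), now))   # 5-day-old clip: False
```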

In many instances where a crime was being committed, it was monitored by staff, and Police were alerted to see who was nearest to where the crime was taking place. They could then go to the scene and take control of the situation.
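That “who is nearest” step is a classic dispatch computation. A minimal sketch follows; the unit names, coordinates, and availability flags are invented for illustration and do not reflect any real dispatch system:

```python
import math

# Hypothetical officer roster: (name, latitude, longitude, available?)
officers = [
    ("Unit 12", 39.2904, -76.6122, True),
    ("Unit 7",  39.3043, -76.6160, False),  # already on a call
    ("Unit 31", 39.2800, -76.6000, True),
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points via the haversine formula."""
    r = 6371.0  # Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_available(incident_lat, incident_lon):
    """Pick the closest unit that is free to respond."""
    candidates = [o for o in officers if o[3]]
    return min(candidates, key=lambda o: distance_km(o[1], o[2], incident_lat, incident_lon))

print(nearest_available(39.2920, -76.6100)[0])  # Unit 12
```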

Many would feel that we have reached the ‘Big Brother is watching you’ stage considering how many cities around the world are using camera technology, but in reality those cameras are not there to spy on the public, but instead are there to help us and keep us safe by watching out for the bad guys.

Lt. Hood and his team proved this. I spend a lot of time in Baltimore, and I am happy that Lt. Hood is watching out for us, and I for one walked away from that presentation feeling a little bit safer.

For more information and to find out more about Baltimore Police check out:

www.baltimorepolice.org

Till next time…

r/Mary

 

15.09.2011 Analysis, law enforcement, security, Uncategorized, video analysis, video analytics Comments Off on Video Analysis/Analytics: Can we use it to detect criminal behaviors and activities?

Video Analysis/Analytics: Can we use it to detect criminal behaviors and activities?

I just found this report published by the National Criminal Justice Reference Service (NCJRS). Developed by Nils Krahnstoever of General Electric (GE) Global Research, it describes the development of a wide range of intelligent video capabilities relevant to law enforcement and corrections, and the features of video surveillance that can help enable early detection and possibly prevention of criminal incidents.

The study also points out, in a number of places, limitations of the technology based on response activities and environmental factors. It’s worth a read; you can read the full document here: Automated Detection and Prevention of Disorderly and Criminal Activities. Here is the table of contents:

 Table of Contents

  • 1 Abstract
  • 2 Executive Summary
    • 2.1 Data Collection
    • 2.2 Crime Detection and Prevention
    • 2.3 System Evaluation and Feedback
    • 2.4 Law Enforcement Relevance and Impact
    • 2.5 Dissemination of Research Results
    • 2.6 Next Steps
  • 3 Introduction
  • 4 Data Sets and Data Collections
    • 4.1 GE Global Research Collection
    • 4.2 Airport and “Behave” Data
    • 4.3 Mock Prison Riot Data
      • 4.3.1 Venue
      • 4.3.2 Installation
      • 4.3.3 Camera Views
      • 4.3.4 Calibration
  • 5 Motion and Crowd Pattern Analysis
    • 5.1 Multi-camera Multi-target Tracking
    • 5.2 Detection and Tracking of Motion Groups
    • 5.3 Counting and Crowd Detection
    • 5.4 Simple Group-Level Events
    • 5.5 Group Interaction Model
    • 5.6 Group Formation and Dispersion
    • 5.7 Agitation and Fighting
    • 5.8 Advanced Aggression Detection
      • 5.8.1 Feature Tracking
      • 5.8.2 Motion Analysis
      • 5.8.3 Motion Classification and Clustering
      • 5.8.4 Results
  • 6 Identity Management
    • 6.1 PTZ Camera Control
      • 6.1.1 Introduction
      • 6.1.2 Related Work
      • 6.1.3 Experiments
      • 6.1.4 Discussions
    • 6.2 Identity Maintenance
  • 7 Social Network Estimation
    • 7.1 Introduction
    • 7.2 Experiments
    • 7.3 Conclusions
  • 8 Data Collection and System Testing at Mock Prison Riot 2009
    • 8.1 Collection and Testing Approach
    • 8.2 IRB Approval
    • 8.3 Collected Video Data
    • 8.4 Mock Prison Riot Detection and Tracking
    • 8.5 PTZ Control
    • 8.6 Behavior and Event Recognition
      • 8.6.1 Meeting / Approaching / Contraband Exchange
      • 8.6.2 Aggression Detection
      • 8.6.3 Fast Movement
      • 8.6.4 Distinct Group Detection
      • 8.6.5 Flanking Detection
    • 8.7 Performance Evaluation
      • 8.7.1 Sequence “Utah Leader Attack” (Nr. 00)
      • 8.7.2 Sequence “Utah Leader Attack 2” (Nr. 01)
      • 8.7.3 Sequence “Gang Killing other Gang” (Nr. 02)
      • 8.7.4 Sequence “Gang Killing other Gang 2” (Nr. 03)
      • 8.7.5 Sequence “Gang Killing other Gang 3 – Unrehearsed” (Nr. 04)
      • 8.7.6 Sequence “Aborted Attack” (Nr. 05)
      • 8.7.7 Sequence “Aborted Attack 2” (Nr. 06)
      • 8.7.8 Sequence “Gang Argument – Prisoners get attacked” (Nr. 07)
      • 8.7.9 Sequence “Gang Initiation” (Nr. 08)
      • 8.7.10 Sequence “Contraband Exchange” (Nr. 09)
      • 8.7.11 Sequence “Multiple Contraband Exchange” (Nr. 10)
      • 8.7.12 Sequence “Contraband with Fight” (Nr. 11)
      • 8.7.13 Sequence “Blended Transaction” (Nr. 12)
      • 8.7.14 Sequence “Shanking followed by Leaving” (Nr. 13)
      • 8.7.15 Sequence “Gang Hanging Out Followed By Several Fights” (Nr. 14)
      • 8.7.16 Sequence “Fight Followed by Guards Leading Offender Off” (Nr. 15)
      • 8.7.17 Sequence “Fight Followed by Guards Leading Offender Off” (Nr. 16)
      • 8.7.18 Sequence “Contraband – Officer Notices” (Nr. 17)
      • 8.7.19 Sequence “Argument Between Gangs – Officer Assault” (Nr. 18)
      • 8.7.20 Sequence “Contraband exchange followed by guard searching inmates” (Nr. 19)
      • 8.7.21 Sequence “Prisoner being attacked and guard intervening” (Nr. 20)
      • 8.7.22 Sequence “Fight breaking out between gang members and officers breaking it up” (Nr. 21)
      • 8.7.23 Sequence “Fight between gangs. Guards breaking fight up” (Nr. 22)
      • 8.7.24 Sequence “Fight between gangs. Guards breaking fight up” (Nr. 23)
      • 8.7.25 Sequence “Gangs fighting. Guards breaking fight up.” (Nr. 24)
  • A Public Dissemination
  • B Reviews and Meetings
    • B.1 Technical Working Group Meeting
    • B.2 Kick-Off Meeting at NIJ
    • B.3 Sensor and Surveillance Center of Excellence Visit
    • B.4 2008 Technologies for Critical Incident Preparedness Expo (TCIP)
    • B.5 Mock Prison Riot 2009
    • B.6 IEEE Conference on Computer Vision 2009
  • C Mock Prison Riot Data
    • C.1 Data Recorded while Processing
    • C.2 Sequences Processed in Detail
    • C.3 Data Recorded without Processing
  • D Technical Details of the PTZ Camera Control
    • D.1 Problem Formulation
    • D.2 Objective Function
      • D.2.1 Quality Measures
      • D.2.2 Quality Objective
      • D.2.3 Temporal Quality Decay
    • D.3 Optimization
      • D.3.1 Asynchronous Optimization
      • D.3.2 Combinatorial Search
  • E Technical Details of Social Network Analysis
    • E.1 Building Social Network
      • E.1.1 Face-to-Track Association via Graph-Cut
    • E.2 Discovering Community Structure via Modularity-Cut
      • E.2.1 Dividing into Two Social Groups
      • E.2.2 Dividing into Multiple Social Groups
      • E.2.3 Eigen-Leaders

 

11.09.2011 Analysis, homeland security intelligence, INSA, intelligence Comments Off on INSA recommends actions for improving homeland security intelligence

INSA recommends actions for improving homeland security intelligence

Today is the ten-year anniversary of 9/11/2001. As we honor the sacrifices of those who perished in the horrible events of that day, we must remember that it is really what we do EVERY DAY that will help to prevent future attacks on our soil, or the soil of our friends and allies. At the heart of the issue is “homeland security intelligence”…the information and data that we will need to deter, detect, and disrupt the activities of those who wish us harm.

On September 7, 2011, the Intelligence and National Security Alliance’s (INSA) Homeland Security Intelligence Council (HSIC) released a white paper proposing significant recommendations for Homeland Security Intelligence. The white paper, entitled “Intelligence to Protect the Homeland: Taking stock ten years later and looking ahead,” examines what has been learned in the last ten years and what will be needed from the intelligence community to protect the homeland from future attacks. My friend and colleague Joe Rozek, who chairs the INSA Homeland Security Intelligence Council, commented:

“The HSIC has worked tirelessly for months and today we are proud to offer our analysis and recommendations to the intelligence and homeland security communities on the 10 year anniversary of September 11th. Homeland Security Intelligence is a discipline that will depend on the successful fusion of foreign and domestic intelligence to produce the kind of actionable intelligence to protect the homeland. Yet, underpinning our analysis is the foundational principle that respect for privacy and civil liberties is an inherent, inseparable part of our national security and core values as a nation.”
 

The major recommendations in the HSIC white paper include:

  1. Adopting a common definition for Homeland Security Intelligence (HSI) to better facilitate its collection, analysis, and use in decision making, as well as development as a discipline;
  2. Departing from the “command and control” or top-down hierarchical model and moving toward an integrated enterprise characterized by coordination of intelligence and analysis efforts among federal, state, local, and tribal law enforcement and intelligence agencies;
  3. As appropriate, bringing select private sector partners into the enterprise; and
  4. Ensuring the protection of Americans’ privacy and civil liberties through widely applicable training and accountability standards to ensure lawful and aggressive detection and deterrence of terrorist operations in the U.S.

Take some time to read this white paper and I’d be interested in your comments and thoughts…r/Chuck

29.12.2009 Analysis, Data, data sharing, Open Government, transparency Comments Off on Data.gov needs some “Tough Love” if it’s to be successful

Data.gov needs some “Tough Love” if it’s to be successful

I just finished commenting on Data.gov on the NIEM LinkedIn Group and thought I would share what I wrote here on my blog.

I just finished watching a rerun episode of Tough Love on VH1 and I know some of you will think this is a bit odd, but the show led me to some thoughts about how to give the Data.gov project some focus and priority.

You’re probably wondering what Data.gov has to do with eight beautiful women looking for marriage and long-lasting love, but believe it or not, the show and Data.gov have a lot in common.

In this particular episode of the show, the “boot camp” director was focusing on communication skills. He made it very clear to the ladies that communication is very important in making a good first impression with a would-be suitor. He counseled that if the ladies wanted to make a good impression, they would need to:

  • Listen carefully to what their date is telling them about what’s important to them;
  • Make the conversation about “them” on first contact and avoid bragging about themselves; and
  • Resist the urge to reveal too much information about their own private lives.

While I will avoid speaking to the validity of this counsel as it applies to love, I would like to suggest that these three rules are also quite relevant in our efforts to have a more transparent, open and collaborative government.

Along these lines, I offer the following three suggestions for Data.gov’s first (transparent, open and collaborative) date with America:

  1. Ask the public (and Congress) what they specifically want to see on Data.gov and the forthcoming dashboard; all apologies to Aneesh Chopra and Vivek Kundra, but I do not believe (as they suggested in the December 8th webcast) that citizens really care much about things like average airline delay times, visa application wait times, or who visited the White House yesterday. I particularly suggest they work with Congressional Oversight Committees to make Data.gov a tool that Congress can (and will) use.
  2. Make Data.gov about demonstrating the good things that Federal agencies do that directly impact the general public. It’s no surprise that most agencies do a poor job of explaining to citizens what they do. I suggest reviving the OMB Program Assessment Rating Tool (PART) Program (which appears to have died on the vine with the new administration) and using the performance measures in the Program Results/Accountability section to better communicate the relevant value these agencies deliver to citizens.
  3. Focus Data.gov data sources and the desire for openness on the critical few measures and metrics that matter to the public. Avoid the urge to just “get the data posted” – not many people will care about how many kilowatt hours of hydroelectric power the Bureau of Reclamation is counting, how many FOIA requests the Department of Justice received, or the Toxic Release Inventory for the Mariana Islands. Information sharing is most successful when it is directly relevant to the person (or agency) with whom you are sharing.

I’ll let you know if the next episode is as enlightening as this was. 😉

r/Chuck

28.12.2009 Analysis, Budget, Data, Information sharing, transparency Comments Off on Data.gov CONOP: Nice document, but fails to address non-technical issues affecting transparency

Data.gov CONOP: Nice document, but fails to address non-technical issues affecting transparency

I just took a look at the OMB Data.gov Concept of Operations, and while I don’t want to sound like a party pooper, I am very concerned about the Data.gov effort. We appear to be moving full speed ahead with the technical aspects of making data available on Data.gov without really thinking through the policy, politics, resource, and other non-technical aspects of the project that could really hurt what could be a very valuable resource.

A few concerns I have include:

1. None of the Data.gov principles in the CONOP address the “real-world effects” we hope to achieve through data.gov–from an operational programs perspective. All seven principles in the CONOP address “internal” activities (means). We need to address success in terms of what citizens will realize through the Data.gov effort.

2. The entire Data.gov effort appears to be driven out of context from any government performance planning and evaluation process. Shouldn’t the need for data transparency be driven by specific strategic management questions?  Where are the links to the President’s Management Agenda? Agency strategic plans?

3. There are more than 200 Congressional Committees with varying degrees of oversight over a similar number of agencies in the Executive Branch. How will Data.gov impact Congress’ efforts to monitor (oversee) agency performance? What will happen when there is a disparity between a) what an agency says it’s doing, b) what its oversight committee(s) say it is doing, and c) how the public views that agency’s performance based on data posted on Data.gov?

4. Transparency, Participation and Collaboration (TPC) are the buzz words of the month, but what does that really mean? The opening sentence of the CONOP states “Data.gov is a flagship Administration initiative intended to allow the public to easily find, access, understand, and use data that are generated by the Federal government.” Do we really expect the general public to access and analyze the data at Data.gov? If so, do we really understand how the public will want to see and access the information? More importantly, are we (the agencies) fully prepared to digest and respond to the public feedback we receive?

5. Who will pay the agencies to support data transparency? Do we really understand the burden involved in achieving open government? The last thing federal agencies need is another unfunded mandate.

6. Finally, how do we know the data that’s made accessible via Data.gov is good (correct) data? The GPRA required OIG review and certification of agency data published in annual performance reports. What can we expect in the way of quality from near-real-time access to agency performance data? Will we require the same data quality process for data feeds posted on Data.gov? Will agencies be funded to do it right?

I provide similar commentary on this issue and an analysis of the recent Executive Order in a December 17th blog posting here: https://www.nowheretohide.org/2009/12/17/open-government-directive-another-ambiguous-unfunded-and-edental-mandate/

Don’t get me wrong, I am all for open government, but let’s do it right. Let’s give the techies a couple of days off and let’s take a good hard look at the non-technical issues that could really hurt this effort if they’re not properly addressed.

Your comments and thoughts welcomed.

Thanks…r/Chuck

17.12.2009 Analysis, data sharing, Information sharing, Open Government 1 Comment

Open Government Directive: Another ambiguous, unfunded, and edental mandate?

Before you send me hate mail, let me state that I am all for Federal agencies sharing data in the spirit of open government, but we have to do it in a smart way, making sure that:

  1. We fully understand why we want it and are clear about what we are really asking for;
  2. We understand the burden involved in achieving open government and that we fund the agencies to do it right;
  3. We are clear about the performance questions that we want the [transparent] data to answer;
  4. We have an understanding for how the public will want to see/access the information; and
  5. We are fully prepared to digest and respond to the public feedback we receive.

After reading the 3,185 words of the Office of Management and Budget (OMB) Open Government Directive (with attachment), I am very sorry to report that, IMO, none of the five criteria (conditions) listed above have been met by the language contained in the document. From what I read:

  • It would appear that no one in the approval chain asked any hard questions about the language–much of the language used is very vague and leaves a lot of room for interpretation (or misinterpretation);
  • There is no mention of how agencies will be funded to build the capacity to meet the additional workload that the requirements of the memorandum are certain to cause.
  • The focus of the document to “get agency data on the web” and “solicit (direct) public feedback” appears to be totally out of context of any other strategic management, performance assessment, or planning framework. This appears to be an end-run around other oversight committees and organizations, like Congress. Will Federal agencies be able to deal with direct feedback from hundreds or thousands of citizens? I am reminded of the old adage “be careful what you ask for”…;
  • The document tells agencies to “publish information online in an open format that can be retrieved, downloaded, indexed, and searched by commonly used web search applications;” however, this can be satisfied in many ways–.txt, .csv, .doc, .pdf, .html,.xml, etc.–some formats will make it very cumbersome for the “public” to view, analyze and understand the data.
  • Finally, the memorandum sets what I believe to be some very unrealistic expectations from both a performance and timeline perspective. For example, how can agencies be expected to review and respond to public input from the web when these same agencies are already overwhelmed with their current day-to-day tasks?
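On the format point above: the difference an open format makes is easy to demonstrate. Data published as plain CSV can be parsed and summarized with a few lines of standard-library Python, while the same table locked inside a .pdf or .doc file requires specialized extraction tools before anyone can analyze it. The agencies and figures below are invented purely for illustration:

```python
import csv
import io

# Hypothetical performance data published as machine-readable CSV (figures invented)
published = """agency,foia_requests_received,foia_requests_closed
Agency A,64000,61000
Agency B,103000,98000
"""

# Anyone can load, filter, and aggregate it directly -- no extraction step needed
rows = list(csv.DictReader(io.StringIO(published)))
total_received = sum(int(r["foia_requests_received"]) for r in rows)
print(total_received)  # 167000
```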

Here are a couple examples to ponder:

On Page 2 – “To increase accountability, promote informed participation by the public, and create economic opportunity, each agency shall take prompt steps to expand access to information by making it available online in open formats”

  • Nowhere in the memorandum are the terms “accountability” or “informed participation” defined
  • What does “create economic opportunity” really mean?
  • It would appear that this mandate circumvents established management processes (OMB, GAO, Congress) for holding Federal agencies accountable for efficient and effective performance.

On Page 3 – “Each agency shall respond to public input received on its Open Government Webpage on a regular basis…Each agency with a significant pending backlog of outstanding Freedom of Information requests shall take steps to reduce any such backlog by ten percent each year.”

  • What do they mean by “respond to public feedback on a regular basis?”
  • All feedback? Some feedback?
  • What does “regular basis” mean? Within 24 hours? Weekly? Annually?

If we really want Federal agencies to be more “open” with their data and information, we must be willing to commit the effort required to:

  • Be clear about what we really want them to do;
  • Give them the funding to do it right;
  • Drive data openness with specific questions we want answered;
  • Present the data in a way that the public can easily understand it; and
  • Be ready and willing to act on the feedback we’re sure to receive.

What are your thoughts and comments on this issue?

Thanks…r/Chuck

30.07.2009 Analysis, CJIS, data sharing, fusion center, intelligence center, Law enforcement information sharing, public safety 1 Comment

Portal-mania: They’re reproducing like bunnies, but they ain’t as cute

I had a conversation with a fusion center director yesterday about portals that really drove home a feeling I had about the recent plethora (read: boatload) of portals that the average analyst supporting public safety and homeland security has to log into in order to do their job.

I’m paraphrasing a bit, but he basically indicated that the state, local, and private sector organizations in his state told him that they “DO NOT want to have to log into multiple portals” to stay informed about criminal and terrorism threats to their state’s infrastructure.

When you take a closer look at the “Portal-mania” that exists, it seems that every agency, and multiple programs within a single agency, has to have its own portal for accessing the information and analytic tools that the agency or program provides. Here’s a quick list of the ones I am familiar with (feel free to email me the names of others you know about):

1. DHS HSIN State and Local Community of Interest (SLIC)
2. DHS Lessons Learned Information Sharing (LLIS)
3. DHS Automated Critical Asset Management System (ACAMS)
4. DOJ Regional Data Exchange (R-DEx)
5. DOJ National Data Exchange (N-DEx)
6. DOJ eGuardian
7. DOJ Law Enforcement Online (LEO)
8. DOJ InfraGard
9. DOJ National Sex Offender Public Website (NSOPW)
10. DOJ National Criminal Intelligence Resource Center (NCIRC)
11. DOJ Regional Information Sharing System (RISS)
12. Private Sector CyberCop
13. [State] Criminal Justice Information System (CJIS)
14. …add to this Department of the Treasury, Department of Transportation, and other federal agency portals
15. …and about three-dozen other databases and private sector websites

This is nutz! Dedicated portals are so 1990s…we should be able to use the same technology I used to create this website and blog (WordPress and four different plug-in widgets) to make information and advanced analytic capabilities available to Fusion Centers and other public safety users. I would like to challenge the agencies and programs listed above to make the information and capabilities they offer available through widgets, web-parts, and gadgets that Fusion Centers and other intelligence/information sharing users can integrate into THEIR portal of choice.

Whether it’s SharePoint, Oracle, or IBM WebSphere, state, local, and private sector organizations should be able to pick the information and capabilities they need to do their jobs from the portal list above and integrate them into THEIR selected portal environment–they should not have to access the multiple, stovepiped portals they do today.
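The widget model being advocated here can be sketched as a simple fan-in: each program exposes a small feed component, and the consumer’s portal of choice composes them all into one view. Everything below is illustrative only; the feed contents are invented and none of this reflects any agency’s real API:

```python
# Each "widget" is just a callable that returns items for a shared dashboard.
# Source names come from the list above; the headlines are invented.
def leo_feed():
    return [("LEO", "Officer safety bulletin posted")]

def ndex_feed():
    return [("N-DEx", "New incident records match your watch criteria")]

def build_dashboard(feeds):
    """Compose every registered feed into one view: one login, one page."""
    items = []
    for feed in feeds:
        items.extend(feed())
    return items

for source, headline in build_dashboard([leo_feed, ndex_feed]):
    print(f"[{source}] {headline}")
```

The point of the design is that the integration burden moves from the analyst (many logins) to the provider (one small component per program), which is exactly the challenge issued above.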

I’d like to know what you think about this…Thanks…r/Chuck Georgo