02.03.2013

Letter to Congressman Reichert: If you want LE information sharing, please aim your pen at a different target

If you want law enforcement agencies to share information, go to the source: help the Chiefs and Sheriffs push their data into the FBI’s National Data Exchange (N-DEx). Trying to impose information sharing through unfunded standards mandates will not work.

Having been in the standards business since 1995, I have learned from history that:

  • The business need must drive standards; standards can NEVER drive the business; and
  • Trying to SELL the business on standards is a losing strategy.

Hi Congressman Reichert,

You won’t remember me, but a long time ago we were in meetings together in Seattle with the likes of John McKay, Dave Brandt, Scott Jacobs, Dale Watson, and others working on building the Law Enforcement Information Exchange (LInX); I was the technical guy on the project, working with Chief Pat Lee and our very dear lost friend Julie Fisher (may she rest in peace; I sure miss her).

A hell of a lot of water has gone under the bridge since then–it’s been nearly TWELVE YEARS. If we look back over this time, we have had so many bills, laws, strategies, policies, papers, speeches, conferences, proclamations, and other assorted attempts to prod law enforcement data loose from the nearly 18,000 agencies across our country. While we are far better off than we were back then, I think we can agree that we still have a long way to go.

Where we differ, I’m afraid, is in the approach to get there. A few days ago, you proposed legislation, the Department of Justice Global Advisory Committee Authorization Act of 2013, as a means to improve information sharing among law enforcement agencies. Do we really believe another “stick” will get agencies to share information? Do we really believe it’s a technology or data standards problem that’s preventing law enforcement data from being shared? As a technologist for 34 years, and someone who has been involved in law enforcement information sharing since the Gateway Project in St. Louis, MO in 1999, I can tell you it is neither.

While I applaud the work of the GAC, and I have many colleagues who participate in its work, I’m afraid having more meetings about information sharing, developing more standards, approving more legislation, and printing more paper will NOT help to reach the level of information sharing we all want.

Instead, I want to propose to you a solution aimed at capturing the commitment of the men and women who can actually make law enforcement information sharing happen, and virtually overnight (metaphorically speaking) – namely, the great men and women who lead our police and sheriffs departments across America.

Now to be fair, many of these agencies are already contributing their records to a system I am sure you are familiar with called the National Data Exchange (N-DEx). Built by the FBI CJIS Division, this system has matured into a pretty respectable platform for not only sharing law enforcement information, but also for helping cops and analysts to do their respective investigative and analytic work.

Now, in case you are wondering, I do not own stock in any of the companies that built N-DEx, nor has the FBI signed me up as a paid informant to market N-DEx. I write to you of my own volition, as a result of my nearly six years of volunteer work as a member of the International Association of Chiefs of Police (IACP) Criminal Justice Information Systems (CJIS) Committee.

About two years ago I volunteered to lead a small sub-group of the committee made up of members who have either built, led, or managed municipal, state, federal, or regional information sharing systems. Our charge was (and still is) to help CJIS take a look under the hood of N-DEx to see what’s in there (data-wise) and to help figure out what needs to be done to make it a more effective tool to help cops across America catch more criminals – and maybe, just maybe, even prevent criminals from acting in the first place.

While our work is far from done, I can tell you that one thing we need is more data – as you well know, be it N-DEx, LInX, RAIN, or any other information sharing system, it is only as good as the data that’s put into it.

Believe it or not, we already have the data standards in place to get the data into N-DEx. CJIS has developed two Information Exchange Package Documentation (IEPD) specifications that tell agencies exactly what to do and how to format and package their data so it can get to N-DEx. Additionally, CJIS has an extensive team ready to assist, and my colleagues over at the IJIS Institute hold BJA-sponsored NIEM training sessions to help agencies along the way.

These two IEPDs can help law enforcement agencies today to share the following law enforcement records:

  • Service Call
  • Incident
  • Arrest
  • Missing Person
  • Warrant Investigation
  • Booking
  • Holding
  • Incarceration
  • Pre-Trial Investigation
  • Pre-Sentence Investigation
  • Supervised Release
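To make “format and package up their data” a bit more concrete, here is a toy sketch in Python (standard library only) of wrapping a few incident fields in a namespaced XML envelope. The namespace, element names, and field values below are illustrative placeholders I made up; a real submission must follow the element structure defined in the applicable N-DEx IEPD:

```python
# Toy illustration of packaging a record as namespaced XML.
# The namespace and element names are hypothetical placeholders,
# NOT the actual N-DEx IEPD schema.
import xml.etree.ElementTree as ET

NS = "http://example.org/ndex-placeholder"  # hypothetical namespace


def package_incident(incident_id: str, agency_ori: str, offense: str) -> str:
    """Wrap a few incident fields in a namespaced XML envelope."""
    ET.register_namespace("", NS)
    root = ET.Element(f"{{{NS}}}IncidentReport")
    ET.SubElement(root, f"{{{NS}}}IncidentID").text = incident_id
    ET.SubElement(root, f"{{{NS}}}AgencyORI").text = agency_ori
    ET.SubElement(root, f"{{{NS}}}OffenseDescription").text = offense
    return ET.tostring(root, encoding="unicode")


print(package_incident("2013-000123", "WA0170000", "Burglary"))
```

The point of an IEPD is that every agency emits the same structure, so the receiving system can parse any agency’s feed the same way.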

So what’s the holdup? Speaking only for myself, and to be very straight with you: I believe the root cause for not getting more law enforcement data into N-DEx is the current piecemeal, politically charged, hit-or-miss grant funding process, which the Act you propose, if passed, will burden even further – see page 3, lines 17-25 and page 4, lines 1-6.

Instead, I ask that you please answer the following question…

If law enforcement information sharing is important enough to push through a Public Act, where is the nationwide project, with funding, to get all shareable law enforcement data loaded into the one system that would give ALL law enforcement officers and analysts access to the collective knowledge of the nearly 18,000 law enforcement agencies?

The immediate answer might be “we already have one: N-DEx.” However, N-DEx is only a piece of the answer – it is, as they say, “one hand clapping.” And in all fairness to my friends and colleagues at the FBI CJIS Division, that program was only charged and funded to build the N-DEx bucket; it was never funded to actually go get the data to fill the bucket.

The strategy back then, for whatever reason, was relegated to a “build it and they will come” approach that, IMHO, has not worked very well so far and may take another 5-10 years to work. I should also note that the bucket isn’t totally empty – quite a number of agencies and regional projects, like LInX, have stepped up and are helping to fill it. However, if we want to expedite filling the bucket, focusing on mandating more standards is not the answer.

What I submit is the “other hand clapping”: a shift in focus away from policy, standards, and technology toward a funded nationwide project that offers a menu of choices and support packages to the Chiefs and Sheriffs, enabling them to start sending as many of their shareable records as possible to N-DEx.

Some of the options/support packages could include:

  1. Provide direct funding to agencies and regional information sharing systems to develop N-DEx-conformant data feeds;
  2. Grant direct funding to RMS and CAD system providers to develop N-DEx-conformant data feeds from their software, with the stipulation that they must offer the capability at no additional cost to agencies that use their products;
  3. Establish a law enforcement data mapping assistance center – either bolted onto the IJIS NIEM Help Desk, as an extension of the NLETS menu of services, or through funding support at an existing information sharing project like the Law Enforcement Technology, Training, & Research Center, which works in partnership with the University of Central Florida.

At the end of the day, we all know that an officer’s safety and effectiveness are greatly affected by the information he or she has at hand when responding to a call.

Do you really want to leave it to chance that an officer’s life is taken, or a criminal or terrorist is let go, because his or her agency wasn’t “lucky enough” to win the grant lottery that year?

So, let’s empower the single most powerful force that can make sure the information is available – the Sheriff or Chief leading that agency. Let’s stop with the unfunded mandates, laws, standards, studies, point papers, etc., and let’s finally put a project in-place with the funding necessary to make it happen.

v/r

Chuck Georgo,

Executive Director
NOWHERETOHIDE.ORG
chuck@nowheretohide.org

05.04.2012

What’s standing in the way of NIEM adoption?

I posted this response to a question on the LinkedIn NIEM group where someone asked about the slow rate of NIEM adoption; I thought I would cross-post my response here.

What’s standing in the way of NIEM adoption?

  • It’s about leadership.
  • It’s about reducing complexity.
  • It’s about getting the word out.
  • It’s about opening up proprietary protocols.
  • It’s about conformance.
  • It’s even about standards.

What’s really standing in the way? Two things: a) utility, and b) a market for it.

I think it would also be wise for us to take a few pages out of the eCommerce, EDI, and ebXML world (and honestly, the internet as a whole). EDI became a standard because large companies said, “If you want to do business with me, you will stop faxing me POs and start sending them to me in this new thing called EDI.” When XML appeared on the scene, the same companies converted their information exchanges to ebXML, and vendors and service providers followed suit – one might say that if it weren’t for the EDI-to-ebXML movement, we might not even be talking about GJXDM or NIEM today; ebXML was groundbreaking in its day.

So what’s in the way? I’ll look at this in terms of two things I mentioned above:

  1. Utility of NIEM – The “Technology Acceptance Model” tells us that for increased adoption of a technology, it must be “useful and easy to use.” Today, however, we are having difficulty getting people to see the utility of NIEM, and it certainly has not proven itself to be easy to use either. Now, to be fair, NIEM started off life as a dictionary of common data elements (words, if you will) with working views of syntax and semantics. Then we have IEPDs. These are like sentences strung together, by different authors, and the difficulty is that we don’t have a good way to know how well those sentences are strung together, whether we can assemble those sentences into comprehensible paragraphs, or even what “stories” (and where in those stories) those sentences might belong. In other words, I don’t think NIEM is coupled tightly enough to the business processes of Justice and Public Safety agencies. To become more useful, we must dust off JIEM, revalidate the inventory of Justice exchanges, and specifically tie them to NIEM IEPDs. And while we do this, we must clean up the inventory of IEPDs, remove ones that are troublesome, and reference those IEPDs back to the Justice business processes and exchanges in JIEM.
  2. A Market for NIEM – Unfortunately, reuse has NOT always resulted in cost savings. There are a number of examples where agencies have had bad experiences with implementing NIEM – whether because of lack of skill on the integrator’s part, poor IEPD design, poor project planning, immature integration tools, or good old politics – so saying an agency will save money by using NIEM is not a strong position right now. To resolve this issue (after we join NIEM and JIEM), I think we must attack the problem at the root: the technology acquisition process. Stop with the buttons and bumper stickers and neat shirts (I have one too). What we need to do is drive NIEM use through RFPs and contracting processes. Of course, we have to first clean up the clearinghouse, but then we must help agencies craft RFP language they can use to call for the use of NIEM and “NIEM-enabled” web services to effect the information exchanges called out in the business processes to be supported by the new technology acquisition. While some vendors have demonstrated leadership in this area, the real driver (in a free market economy) is the contracting process – vendors will invest in their ability to adopt and integrate NIEM if it’s in their financial interest to do so; they have payrolls to meet and investors to keep happy. One shining light in this quest is the effort by IJIS and others who are working hard to establish a “standards-like” effort to clean up IEPDs and to help vendors demonstrate conformance (or compliance) to those standards for their products.
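The “join NIEM and JIEM” idea above is, at bottom, a mapping problem: every business exchange in the JIEM inventory should point at the IEPD that implements it, so gaps become visible. A minimal sketch in Python, with entirely hypothetical exchange and IEPD names of my own invention:

```python
# Toy sketch: tying a JIEM-style inventory of business exchanges to
# the IEPDs that implement them. All names are hypothetical placeholders.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Exchange:
    name: str            # business exchange from the JIEM inventory
    sender: str
    receiver: str
    iepd: Optional[str]  # IEPD that implements it, if one exists


EXCHANGES = [
    Exchange("Arrest Report", "Police", "Prosecutor", "arrest-iepd-v1"),
    Exchange("Charging Document", "Prosecutor", "Court", None),
]


def coverage_gaps(exchanges: List[Exchange]) -> List[str]:
    """Return the business exchanges that no IEPD currently covers."""
    return [e.name for e in exchanges if e.iepd is None]


print(coverage_gaps(EXCHANGES))  # → ['Charging Document']
```

With the inventory in this shape, “clean up the IEPD clearinghouse” becomes a concrete task: every IEPD either traces back to a business exchange or is a candidate for removal.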

Your comments/thoughts are welcomed….r/Chuck

29.12.2009

Data.gov needs some “Tough Love” if it’s to be successful

I just finished commenting on Data.gov on the NIEM LinkedIn Group and thought I would share what I wrote here on my blog.

I just finished watching a rerun episode of Tough Love on VH1 and I know some of you will think this is a bit odd, but the show led me to some thoughts about how to give the Data.gov project some focus and priority.

You’re probably wondering what Data.gov has to do with eight beautiful women looking for marriage and long-lasting love, but believe it or not, the show and Data.gov have a lot in common.

In this particular episode of the show, the “boot camp” director was focusing on communication skills. He made it very clear to the ladies that communication is very important in making a good first impression with a would-be suitor, and he counseled them that if they wanted to make a good impression, they would need to:

  • Listen carefully to what their date is telling them about what’s important to him;
  • Make the conversation about “him” on first contact and avoid bragging about themselves; and
  • Resist the urge to reveal too much information about their own private lives.

While I will avoid speaking to the validity of this counsel as it applies to love, I would like to suggest that these three rules are also quite relevant in our efforts to have a more transparent, open and collaborative government.

Along these lines, I offer the following three suggestions for Data.gov’s first (transparent, open and collaborative) date with America:

  1. Ask the public (and Congress) what they specifically want to see on Data.gov and the forthcoming dashboard; all apologies to Aneesh Chopra and Vivek Kundra, but I do not believe (as they stated in the December 8th webcast) that citizens really care much about things like average airline delay times, visa application wait times, or who visited the Whitehouse yesterday. I particularly suggest they work with Congressional Oversight Committees to make Data.gov a tool that Congress can (and will) use.
  2. Make Data.gov about demonstrating the good things that Federal agencies do that directly impact the general public. It’s no surprise that most agencies do a poor job of explaining to citizens what they do. I suggest reviving the OMB Performance Assessment Rating Tool (PART) Program (which appears to have died on the vine with the new administration) and use the performance measures in the Program Results/Accountability section to better communicate the relevant value these agencies deliver to citizens.
  3. Focus Data.gov data sources and the desire for openness on the critical few measures and metrics that matter to the public. Avoid the urge to just “get the data posted” – not many people will care about how many kilowatt hours of hydroelectric power the Bureau of Reclamation is counting, how many FOIA requests the Department of Justice received, or the Toxic Release Inventory for the Mariana Islands. Information sharing is most successful when it is directly relevant to the person (or agency) with whom you are sharing.

I’ll let you know if the next episode is as enlightening as this was. 😉

r/Chuck

28.12.2009

Data.gov CONOP: Nice document, but fails to address non-technical issues affecting transparency

I just took a look at the OMB Data.gov Concept of Operations, and while I don’t want to sound like a party pooper, I am very concerned about the Data.gov effort. We appear to be moving full speed ahead with the technical aspects of making data available on Data.gov without really thinking through the policy, political, resource, and other non-technical aspects of the project – issues that could really hurt what could be a very valuable resource.

A few concerns I have include:

1. None of the Data.gov principles in the CONOP address the “real-world effects” we hope to achieve through data.gov–from an operational programs perspective. All seven principles in the CONOP address “internal” activities (means). We need to address success in terms of what citizens will realize through the Data.gov effort.

2. The entire Data.gov effort appears to be driven out of context from any government performance planning and evaluation process. Shouldn’t the need for data transparency be driven by specific strategic management questions?  Where are the links to the President’s Management Agenda? Agency strategic plans?

3. There are more than 200 Congressional Committees with varying degrees of oversight over a similar number of agencies in the Executive Branch. How will Data.gov impact Congress’ efforts to monitor (oversee) agency performance? What will happen when there is a disparity between a) what an agency says it’s doing, b) what oversight committee(s) say it is doing, and c) how the public views that agency’s performance based on data posted on Data.gov?

4. Transparency, Participation and Collaboration (TPC) are the buzzwords of the month, but what does that really mean? The opening sentence of the CONOP states “Data.gov is a flagship Administration initiative intended to allow the public to easily find, access, understand, and use data that are generated by the Federal government.” Do we really expect the general public to access and analyze the data at Data.gov? If so, do we really understand how the public will want to see and access the information? More importantly, are we (agencies) fully prepared to digest and respond to the public feedback we receive?

5. Who will pay the agencies to support data transparency? Do we really understand the burden involved in achieving open government? The last thing federal agencies need is another unfunded mandate.

6. Finally, how do we know the data that’s made accessible via Data.gov is good data (correct)? The GPRA required OIG review and certification of agency data published in annual performance reports. What can we expect in the way of quality from near-real-time access to agency performance data? Will we require the same data quality process for data feeds posted on Data.gov? Will agencies be funded to do it right? 

I provide similar commentary on this issue and an analysis of the recent Executive Order in a December 17th blog posting here: https://www.nowheretohide.org/2009/12/17/open-government-directive-another-ambiguous-unfunded-and-edental-mandate/

Don’t get me wrong, I am all for open government, but let’s do it right. Let’s give the techies a couple of days off and let’s take a good hard look at the non-technical issues that could really hurt this effort if they’re not properly addressed.

Your comments and thoughts welcomed.

Thanks…r/Chuck