Comparing how state legislatures make their data publicly available.
For more context, read this post and see our methodology.
(Note: Since the publication of this report card several states have come to us with additional information or made changes that would affect their score. Details are available below.)
State | Completeness | Timeliness | Ease of Access | Machine Readability | Standards | Permanence | Grade |
---|---|---|---|---|---|---|---|
Connecticut | 0 | 1 | 0 | 1 | 0 | 2 | A |
Georgia | 0 | 0 | 0 | 2 | 0 | 2 | A |
Kansas | 0 | 0 | 1 | 1 | 1 | 2 | A |
New Hampshire | 0 | 0 | 0 | 2 | 0 | 2 | A |
North Carolina | 0 | 1 | 0 | 1 | 0 | 2 | A |
Texas | -1 | 1 | 1 | 2 | 0 | 2 | A |
Washington | 0 | 1 | 0 | 2 | 0 | 2 | A |
Alaska | 0 | 1 | 0 | 0 | 0 | 2 | B |
Arkansas | 0 | 1 | 0 | 1 | 0 | 2 | A |
Maryland | 0 | 1 | 0 | 1 | 0 | 1 | B |
Mississippi | 0 | 0 | 0 | 1 | 0 | 2 | B |
Nevada | 0 | 1 | 0 | 0 | 0 | 2 | B |
New Jersey | 0 | 0 | -1 | 2 | 0 | 2 | B |
New York | 0 | 1 | 0 | 1 | 0 | 2 | A |
Ohio | 0 | 1 | -1 | 1 | 0 | 2 | B |
Utah | 0 | 0 | 1 | 0 | 0 | 2 | B |
Vermont | 0 | 1 | 0 | 0 | 0 | 2 | B |
West Virginia | 0 | 1 | 1 | -1 | 0 | 2 | B |
Arizona | 0 | 0 | -1 | 0 | 0 | 2 | C |
Delaware | 0 | 1 | -1 | 0 | 0 | 2 | C |
District of Columbia | 0 | 1 | -1 | -1 | 0 | 1 | D |
Florida | 0 | 1 | 0 | -1 | 0 | 2 | C |
Hawaii | -1 | 0 | 0 | 0 | 0 | 2 | C |
Idaho | 0 | 0 | 0 | 0 | 0 | 1 | C |
Illinois | 0 | 1 | 0 | -1 | 0 | 2 | C |
Iowa | 0 | 1 | 0 | -1 | 0 | 2 | C |
Michigan | 0 | 1 | 0 | 1 | 0 | 0 | C |
Minnesota | -1 | 1 | 0 | 0 | 0 | 2 | C |
Missouri | 0 | 0 | 0 | -1 | 0 | 2 | C |
Montana | 0 | 1 | 0 | -1 | 0 | 2 | C |
New Mexico | 0 | 0 | 0 | -1 | 0 | 2 | C |
North Dakota | 0 | 0 | 0 | -1 | 0 | 2 | C |
Oregon | 0 | -1 | 0 | 0 | 0 | 2 | C |
Pennsylvania | 0 | 1 | 1 | 1 | 0 | 2 | A |
South Carolina | 0 | 0 | 0 | 0 | 0 | 2 | C |
South Dakota | 0 | 1 | 0 | 0 | 0 | 2 | B |
Tennessee | 0 | 1 | 0 | 0 | 0 | 0 | C |
Virginia | 0 | 1 | 0 | 1 | 0 | 2 | A |
Wyoming | 0 | 0 | 0 | 0 | 0 | 2 | C |
California | 0 | 0 | -1 | 1 | 0 | 0 | D |
Indiana | -1 | 1 | 0 | -1 | 0 | 0 | D |
Louisiana | 0 | 1 | -1 | -1 | 0 | 0 | D |
Maine | 0 | 1 | -1 | 0 | 0 | 0 | D |
Oklahoma | 0 | 1 | -1 | 0 | 0 | 0 | D |
Wisconsin | 0 | 0 | 0 | 0 | 0 | 0 | D |
Alabama | 0 | 1 | -2 | -1 | 0 | -1 | F |
Colorado | 0 | 1 | 0 | -1 | 0 | 1 | C |
Kentucky | 0 | 0 | 0 | -2 | -1 | 0 | F |
Massachusetts | -1 | 1 | -2 | -2 | 0 | -1 | F |
Nebraska | 0 | 0 | 0 | -1 | 0 | -1 | F |
Rhode Island | 0 | 1 | 0 | 0 | 0 | -1 | D |
Each state was evaluated in six categories based largely on the Ten Principles For Opening Up Government Information. Each score reflects the assessment of at least two staff members and a volunteer during our state survey. Additionally, state legislatures were contacted (unless noted in their score) to ensure that our information on bulk data availability and timeliness was as accurate as possible.
The specific criteria for each category are as follows:
Completeness - We evaluated each state on the data collected by Open States: bills, legislators, committees, votes and events. We also took note when a state went above and beyond by providing other relevant contextual information, such as supporting documents, legislative journals and schedules. Points were deducted for missing data, most often roll call votes.
Timeliness - Legislative information is most relevant as it happens, and many states publish information in real time. Unfortunately, there are also states where updates are less frequent, showing up days after a legislative action took place. States were dinged if data took more than 48 hours to go online.
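As an illustration, here is a minimal sketch of the 48-hour check, assuming we have timestamps for when an action occurred and when it appeared online (the record format here is hypothetical, not any state's):

```python
from datetime import datetime, timedelta

# States were dinged when data took more than 48 hours to go online.
MAX_LAG = timedelta(hours=48)


def is_timely(action_time: datetime, published_time: datetime) -> bool:
    """True if the action appeared online within the 48-hour window."""
    return published_time - action_time <= MAX_LAG


# A vote taken at noon on March 11th but not posted until noon on
# March 14th (72 hours later) would have cost the state points.
assert not is_timely(datetime(2013, 3, 11, 12), datetime(2013, 3, 14, 12))
assert is_timely(datetime(2013, 3, 11, 12), datetime(2013, 3, 12, 12))
```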
Ease of Access - Common web technologies such as Flash or JavaScript can cause problems when reviewing legislative data. We found that the majority of sites work fairly well without JavaScript, but some received lower scores because they were extremely difficult to navigate, made it impossible to bookmark bills or, in extreme cases, were completely unusable.
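One rough way to automate that kind of check: fetch a page's raw HTML, in which no JavaScript executes, and look for content that should be there. A minimal sketch, with a hypothetical URL and expected text:

```python
from urllib.request import urlopen


def usable_without_js(url: str, expected_text: str) -> bool:
    """True if the expected content is present in the static HTML.

    urlopen does not execute JavaScript, so anything found here is
    available to users (and scrapers) browsing with scripts disabled.
    """
    with urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return expected_text in html


# Hypothetical example: does the bill page render its title without JS?
# usable_without_js("https://somelegislature.gov/2011/HB1", "House Bill 1")
```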
Machine Readability - For many sites, the Open States team wrote scrapers to collect legislative information from the website code itself, a slow, tedious and error-prone process. We collected data faster and more reliably when it was provided in a machine-readable format such as XML, JSON or CSV, or via bulk downloads. If a state posted PDF image files or scanned documents, it received the lowest score possible.
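To illustrate the difference, here is a hedged sketch contrasting the two approaches; the URLs, JSON field names and table markup are hypothetical, not any state's actual site or API:

```python
import json
from urllib.request import urlopen

import lxml.html  # HTML parsing library widely used in the Open States scrapers


def bills_from_json(url):
    """Structured feed: one request, unambiguous fields."""
    with urlopen(url) as resp:
        return [{"id": b["bill_id"], "title": b["title"]} for b in json.load(resp)]


def bills_from_html(url):
    """Scraping: XPath expressions tied to the site's current markup."""
    with urlopen(url) as resp:
        doc = lxml.html.fromstring(resp.read())
    bills = []
    # Any cosmetic redesign of the legislature's site silently breaks this.
    for row in doc.xpath("//table[@id='bills']//tr[td]"):
        cells = [c.strip() for c in row.xpath("td/text()")]
        bills.append({"id": cells[0], "title": cells[1]})
    return bills
```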
Standards - Because our ability to access most of a state’s data is captured by the “Machine Readability” score above, we used this category to measure how each state makes its bill text available. Making text available in HTML or PDF is the norm and was considered an acceptable, commonly owned standard (though where bill text is only available as PDF, it would certainly be nice to see alternative options). States that only make documents available in Microsoft Word or WordPerfect formats require an individual to purchase expensive software or rely on free alternatives that may not preserve the correct formatting. It is worth noting that all but two states met the common criterion of providing HTML and/or PDF; one state (Kansas) went above and beyond, and another (Kentucky) did not even meet this threshold.
Permanence - Many states move or remove information when a new session starts, much to the dismay of citizens seeking information on old proposals and researchers who may have cited a link (e.g. https://somelegislature.gov/HB1 vs. https://somelegislature.gov/2011/HB1) only to see it point to a different bill in the following session. Tim Berners-Lee, inventor of the World Wide Web, wrote an article declaring that Cool URIs Don’t Change, and we agree.
This poses a particular challenge for us: every page on OpenStates.org links to the page we collected its data from, and if a state changes its site, users lose the ability to check us against the original source. Most (but not all) states are good about at least preserving bill information, but few are equally good about preserving information on out-of-office legislators and historical committees, which are equally important parts of the legislative process.
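As a small illustration of the URL-design point, using the hypothetical somelegislature.gov addresses from the example above:

```python
def permanent_bill_url(session: str, bill_id: str) -> str:
    """Session in the path: 2011's HB1 and 2013's HB1 never collide."""
    return f"https://somelegislature.gov/{session}/{bill_id}"


def ambiguous_bill_url(bill_id: str) -> str:
    """No session in the path: the same URL points at a new bill each session."""
    return f"https://somelegislature.gov/{bill_id}"


# A citation made in 2011 stays correct under the first scheme...
assert permanent_bill_url("2011", "HB1") != permanent_bill_url("2013", "HB1")
# ...but under the second, a 2011 citation silently becomes a 2013 bill.
assert ambiguous_bill_url("HB1") == ambiguous_bill_url("HB1")
```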
Since the initial publication of this report card on March 11th, 2013, some states have provided us with additional information or made changes in response that have affected their scores. These changes are reflected below and noted on the report card itself.
Rhode Island - On March 12th, 2013 we confirmed with Rhode Island IT staff that data is updated in real time, not weekly as we had initially been told. This raised their score by 2 points, bringing them into the 'D' class.
New York - On March 12th, 2013 New York Senate staff reached out and clarified their update policy, raising their score by a point and putting them into the 'A' class. They also pointed us to a better API, which may affect their Machine Readability score in the future.
Virginia - On March 22nd, 2013 Virginia Legislature staff clarified their update policy, raising their score by a point and putting them into the 'A' class. A coming change to their data availability may raise their score further.
Colorado - On March 22nd, 2013 a discussion with an IT manager from the Colorado Legislature established that the bills that had gone missing were the result of site errors and that no actual data was lost. This substantially changed Colorado's Timeliness score, raising the state from an F to a C.
Pennsylvania - On December 4th, 2013 we evaluated the new Pennsylvania website, unveiled in November 2013. This resulted in an increase in the Timeliness, Ease of Access and Machine Readability scores.
District of Columbia - On March 24th, 2015 we evaluated the new District of Columbia website. The result was a decrease in the Ease of Access and Machine Readability scores and an increase in Timeliness, lowering DC from a C to a D.