Technical road map and software development process

Earlier sections of this study discussed the need to employ user-centered, agile processes in developing enhancements to the e-filing system, and offered specific recommendations about the design and infrastructure changes we believe will help both filers and those who support them while avoiding unnecessary costs. This section lays out the proposed software development workflow in more detail, along with a rough roadmap of next steps for upgrading the infrastructure.

This recommendation is part of the 2016 E-Filing study.

Evolving software development workflow

Test driven development

Test driven development (TDD) is an agile approach to software development that requires writing unit tests, where a unit is the smallest testable software component, prior to or in parallel with writing the functional code; the code is then tested as soon as it is written. Developers write tests and code in small, iterative cycles. TDD is a best practice in modern software because the benefits of testing outweigh the time it takes to write tests. A thoughtful test suite will make sure that key processes in software do not regress as the software evolves.

One of the main benefits of automated testing is never shipping the same bug twice. When you write tests for situations that you know have caused problems in the past, you automatically check that the outputs of those situations are as expected. A wide array of unit tests should check that the outputs of individual functions behave as expected, and a targeted set of integration tests should make sure that processes function correctly together. As a starting point, the FEC could ask internal software testers (such as RAD) for common scenarios they check when they look at software problems, and recreate those scenarios as automated tests.
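As a concrete illustration, a regression test pins down behavior that once broke so it cannot silently break again. Everything below is a sketch: `parse_amount` is a hypothetical unit from a filing pipeline, not the FEC's actual code.

```python
# Hypothetical unit from a filing pipeline: parse a currency string into cents.
# The function name and behavior are illustrative, not the FEC's actual code.
def parse_amount(raw: str) -> int:
    """Convert a string like '1,250.75' to an integer number of cents."""
    cleaned = raw.strip().replace(",", "").replace("$", "")
    dollars, _, cents = cleaned.partition(".")
    return int(dollars) * 100 + int(cents.ljust(2, "0") or 0)

# Regression tests: each case encodes a bug seen in the past, so the
# same failure can never reappear unnoticed.
def test_parse_amount():
    assert parse_amount("1,250.75") == 125075  # comma grouping once broke parsing
    assert parse_amount("$40") == 4000         # amount with no decimal point
    assert parse_amount(" 7.5 ") == 750        # single-digit cents, stray whitespace
```

Run as part of a larger suite, tests like these document past failures as executable checks.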

For the FEC, a robust test suite could increase the confidence of vendors. Some vendors do not start using new FEC modules until they have completed their own in-house testing. This means that advancements in the modules, like the improvements to the FEC print module that enabled automated processing of paper forms, do not reach filers until a month or more later.

If there is an open source test suite that proves the integrity of the e-filing modules, vendors can run the tests themselves and have confidence that the regressions they have seen before will not trouble their clients again. Open sourcing these tests could be an opportunity for vendors and the wider community to contribute, since they share incentives to make sure the software components are functioning properly.

Adding a robust test suite makes it easier to make changes to the existing codebase or create a new codebase. Because software changes in a complex data environment, like FEC disclosure systems, can have wide-ranging impacts, tests create more certainty that unintended consequences will not creep into the code. This can lead to faster development over time. For the Beta FEC project, we also check for problems like N+1 queries, which are easy to miss and can slow the system down later.
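The N+1 query pattern mentioned above can be sketched with an in-memory SQLite database. The schema and names here are invented for illustration, not the FEC's actual disclosure schema:

```python
import sqlite3

# Illustrative schema only; table and column names are assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE committee (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE filing (id INTEGER PRIMARY KEY, committee_id INTEGER, form TEXT);
    INSERT INTO committee VALUES (1, 'Cmte A'), (2, 'Cmte B');
    INSERT INTO filing VALUES (1, 1, 'F3'), (2, 1, 'F24'), (3, 2, 'F3');
""")

def filings_n_plus_one():
    # Anti-pattern: one query for the committees, then one more per committee.
    out = {}
    for cid, name in db.execute("SELECT id, name FROM committee"):
        rows = db.execute("SELECT form FROM filing WHERE committee_id = ?", (cid,))
        out[name] = [r[0] for r in rows]
    return out

def filings_single_query():
    # One JOIN returns the same data in a single round trip.
    out = {}
    query = """SELECT c.name, f.form FROM committee c
               JOIN filing f ON f.committee_id = c.id"""
    for name, form in db.execute(query):
        out.setdefault(name, []).append(form)
    return out
```

Both functions return the same data, but the first issues one query per committee, which adds up quickly at scale; a test comparing their outputs can guard a refactor from the slow form to the fast one.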

A critical part of testing is making sure your tests run automatically. Finding bugs early leads to smoother development. One of the best ways to keep bugs out of production is to run tests as part of the deploy process and only deploy or publish changes when tests pass.

You can read more granular information about testing in the 18F Automated Testing Playbook.

Moving e-filing data intake to the cloud should follow an incremental process for each component being migrated. The first critical step is creating a test suite covering the core functionality of each unit. Next, the new process should be set up and required to pass the same tests as the process it replaces. Then comes a deeper quality inspection, in which the outputs of both processes are compared. As bugs are uncovered, new tests should be added to the suite so that problems are not repeated.
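The parallel-run comparison described above can be sketched as follows; the two functions are stand-ins for a real legacy process and its rewrite, invented for illustration:

```python
# Migration check sketch: run the legacy and the new implementation over the
# same inputs and collect the records where they disagree. Both functions
# below are stand-ins, not the FEC's actual nightly processes.
def legacy_process(record: dict) -> dict:
    return {"id": record["id"], "total": sum(record["amounts"])}

def new_process(record: dict) -> dict:
    # The rewrite rounds totals to whole cents; a real rewrite would
    # differ in less trivial ways.
    return {"id": record["id"], "total": round(sum(record["amounts"]), 2)}

def compare_outputs(records):
    """Return the ids where the old and new pipelines disagree."""
    mismatches = []
    for rec in records:
        if legacy_process(rec) != new_process(rec):
            mismatches.append(rec["id"])
    return mismatches
```

Each mismatch found this way either reveals a bug in the new process or a long-standing quirk in the old one; in both cases the finding should become a new automated test.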

Automated workflow management

Cloud architecture depends on the ability of apps and databases to be reproduced on any server in the cloud system. This automated replacement of apps happens frequently and does not cause downtime, because the system does not take the old instance of the app down until the new instance is running. That way, users get an uninterrupted experience. Because cloud systems depend on frequent, automated deploys, cloud deployments encourage consistent configuration management across projects. This helps maintain baseline security controls, requires less maintenance in tracking down and patching individual systems, and encourages good practices like automated deploys contingent on tests passing.

Because the cloud model treats infrastructure as reproducible units that can be moved and recreated smoothly, there is an opportunity to automate the tedious work associated with the deployment process, minimize human error, and give people more time for higher-level work.

In workshops and interviews, FEC employees brought up the current workflow as a significant pain point. The EFO has made gains in managing its workflow, but the nightly processes and the data system as a whole would benefit from more automated workflow management.

Automating workflows allows for scalability and parallel processing. For example, a trigger can be set up to sense high traffic and add more app instances in response. Having multiple instances can also cut processing times, and since a cloud pricing model charges only for what you use, several instances can run simultaneously for as long as they are needed and then be shut down once the jobs are done.
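In minimal form, a scaling trigger of the kind described reduces to a policy that maps pending work to an instance count. The thresholds and names below are invented for illustration; a real platform's autoscaler would apply its own policy:

```python
# Sketch of a scaling policy: choose an instance count from queue depth.
# The per-instance capacity and bounds are illustrative assumptions.
def desired_instances(pending_jobs: int, per_instance: int = 50,
                      min_instances: int = 1, max_instances: int = 10) -> int:
    """Scale out while work is queued; scale back in when it drains."""
    needed = -(-pending_jobs // per_instance)  # ceiling division
    return max(min_instances, min(max_instances, needed))
```

A trigger would poll the queue on a schedule, call a policy like this, and ask the platform to add or remove instances to match, so costs track actual load.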

Technical road map

As part of this study, the 18F team was asked to make a roadmap to modernize and improve the FEC’s electronic filing platform. The goal of the roadmap is to lay out a plan to build on the FEC’s current strengths and identify where and how to progress.

The roadmap begins with an ordered list of tasks. While the tasks are roughly ordered, some could be done concurrently. The main technical elements of each task are listed in parentheses. Next we talk about ongoing processes that will help the FEC be successful, and we conclude with recommendations for the technical infrastructure (stack) that the FEC should use to complete recommended tasks and process changes.

A. Add a test suite

A test suite is a collection of test cases employed by software developers to ensure the program is functioning as expected. One of the main concerns from FEC staff in redeveloping e-filing is degrading the reliability of the system. Testing is one way to mitigate that risk when making changes. Test suite considerations:

  • Start by writing tests for current functionality.
  • Write tests for new features and bug fixes going forward.
  • Add automated tests for things that typically get a manual quality acceptance test.
  • Add tests for bugs that have occurred in the past, to make sure mistakes are not repeated.

B. Improve upload speeds by creating a cloud-based upload API

The FEC should create a cloud-based submission API that pushes files to Amazon static file storage. Employing cloud architecture for the submission process would allow the FEC to seamlessly add server resources as needed and maintain upload speeds even in times of high demand. Using a static, cloud-based hosting service would also enable files to be uploaded without special processes or FEC-based server delays. One example of this process in action is the Data Act API, an open source project with an endpoint that uploads to Amazon S3.

This architecture could also be employed for validating filings, which would enable more parallel processing of validation: several instances could be run at once, and then extra instances could be spun down when they are not in use.
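A minimal sketch of the submission endpoint's core logic described above, with an in-memory object standing in for a cloud store like S3; the key layout, class, and function names are assumptions, not the FEC's actual API:

```python
import datetime
import hashlib

# InMemoryStorage stands in for a cloud object store (e.g., S3); a real
# deployment would swap in a client with the same put() interface.
class InMemoryStorage:
    def __init__(self):
        self.objects = {}

    def put(self, key: str, data: bytes):
        self.objects[key] = data

def submit_filing(storage, committee_id: str, payload: bytes) -> str:
    """Store an uploaded filing and return the object key it was written to."""
    digest = hashlib.sha256(payload).hexdigest()[:12]  # content fingerprint
    today = datetime.date.today().isoformat()
    key = f"filings/{today}/{committee_id}/{digest}.fec"
    storage.put(key, payload)
    return key
```

Keeping storage behind a small interface like this is what makes the parallel validation described above practical: any number of worker instances can read the same keys from the shared object store.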

C. Improve data validation

The following changes should be implemented as part of an iterative, human-centered design process in collaboration with users.

  • Rewrite warning and error message text in plain language.
  • Add guidance on how and where to fix the problem to warning and error message text.
  • Add suggestions that help filers know how to enter the right data in the right format.
  • Re-architect FECCheck to enable in-line data validation.
  • Start validating the form fields currently validated by FECCheck in-line.
  • Deliver warning and error messages to filers at the point of data entry.
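In-line validation with plain-language messages might look like the sketch below; the field names, rules, and message text are illustrative, not FECCheck's actual behavior:

```python
import re

# Each rule pairs a format check with a plain-language message that tells
# the filer what to enter and gives an example. Rules are illustrative.
RULES = {
    "zip_code": (re.compile(r"^\d{5}(-\d{4})?$"),
                 "Enter a 5-digit ZIP code, like 20463, or ZIP+4, like 20463-0001."),
    "amount":   (re.compile(r"^\d+(\.\d{2})?$"),
                 "Enter a dollar amount with two decimal places, like 250.00."),
}

def validate_field(name: str, value: str):
    """Return None if the value is valid, else a plain-language message."""
    pattern, message = RULES[name]
    return None if pattern.match(value) else message
```

Because the check runs per field, the message can be shown at the point of data entry, rather than after the whole filing is submitted.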

D. While unifying the existing SQL databases, implement data improvements

  • Unify existing SQL databases to use postgres in cloud architecture. (The upcoming Cloud study will have more details about other cloud migration plans.)
  • While making changes to the current database structure, partition data and triggers to only update changes to the information, rather than recreating the information. {Add links to data recommendations}
  • The FEC could alleviate the data-typing issues described above by re-architecting the existing e-filing data schema. One short-term solution is to keep the existing table and add columns for the different field types (though this would violate the database design best practice of normal form). A scalable solution, which will be needed in the long term, is to refactor the table into two tables.
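The two-table refactor could be sketched as follows, with a narrow parent table and a child table holding one row per field. The schema and names are illustrative, not the FEC's actual e-filing schema:

```python
import sqlite3

# Instead of one wide table with a column per possible field, a parent
# "filing" table holds shared attributes and a child "filing_field" table
# holds one row per field value. Names here are assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE filing (
        filing_id INTEGER PRIMARY KEY,
        form_type TEXT NOT NULL
    );
    CREATE TABLE filing_field (
        filing_id INTEGER REFERENCES filing(filing_id),
        name      TEXT NOT NULL,
        value     TEXT NOT NULL,
        PRIMARY KEY (filing_id, name)
    );
    INSERT INTO filing VALUES (1, 'F3');
    INSERT INTO filing_field VALUES (1, 'committee_name', 'Cmte A'),
                                    (1, 'total_receipts', '1250.75');
""")

def get_field(filing_id: int, name: str):
    row = db.execute(
        "SELECT value FROM filing_field WHERE filing_id = ? AND name = ?",
        (filing_id, name)).fetchone()
    return row[0] if row else None
```

New field types then become new rows rather than schema changes, which is what makes this shape scale as forms evolve.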

E. Redesign of the print API to include machine readable data elements

  • Add machine readable elements (such as QR codes) to FECPrint, and give FECLoad the ability to read them, to reduce errors and further improve processing times beyond recent improvements.

F. Make the filing utility platform agnostic*

*How this is achieved depends on the FEC’s decision of whether to create a FECFile web app.

  • Begin with small, achievable goals.
  • First, create a platform agnostic version of a small form, such as Form 2, before moving on to a larger, more commonly used form like Form 3.
  • Design and test each element in collaboration with filers, RAD, and the EFO.
  • Move away from the .dcf file format. This information is better stored in a database.
  • Develop a web app version of FECFile. With open sourced code and API driven infrastructure, a web app will make it easy for vendors to integrate their own database and hosting solutions.

For FECFile users, the storage approach depends on whether the new app is web based or desktop based. A web app could be integrated with the postgres database, with the data encrypted for storage and decrypted upon submission. This would be the fastest method and would allow campaign members to collaborate on the same filing.

If FECFile were to remain a desktop application, it could be built with a local database (like SQLite) incorporated. This would be seamless to users: they would not need to directly interface with the database; rather, it would function like an app on a phone. Storing data this way would allow the software to perform faster, and because SQLite is a standard open database format, it would have the added benefit of making the data easier for developers to integrate with other tools.

Redesigning FECFile presents an opportunity to re-evaluate the .fec file format and potentially move to a more standard format such as .csv or .json, though this change would require changing the upload API as well.
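Purely as an illustration of such a format conversion, the sketch below reads a simplified comma-delimited filing and emits JSON. The layout and field names are assumptions for illustration; the real .fec format differs:

```python
import csv
import io
import json

# A simplified stand-in for a filing: comma-delimited lines where the
# first value names the record type. Layout and names are assumptions.
SAMPLE = "F3,Cmte A,20463\nSA,John Smith,250.00\n"

FIELD_NAMES = {  # hypothetical per-record-type column names
    "F3": ["form_type", "committee_name", "zip_code"],
    "SA": ["form_type", "contributor_name", "amount"],
}

def filing_to_json(text: str) -> str:
    """Convert delimited records to a JSON array of named objects."""
    records = []
    for row in csv.reader(io.StringIO(text)):
        names = FIELD_NAMES[row[0]]
        records.append(dict(zip(names, row)))
    return json.dumps(records, indent=2)
```

The point of a standard format is visible in the output: once records carry field names, downstream tools can consume them without a custom parser.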

The main issue with a format change away from .fec files would be recognizing the needs of departments other than e-filing. Special attention needs to be paid to make sure that Press and Public Records, and RAD's tools are still operational. Public outreach and public development of these tools will help others adapt to the changes.

G. Improve speeds by migrating static items to a static file hosting service such as Amazon Web Services S3/Glacier

Cloud based hosting services provide convenient storage and delivery of static content such as images, PDFs, and data files. In addition to increasing speed, moving to a managed cloud system typically reduces costs. The beta FEC architecture already uses this approach, and it has been shown to handle large files well.

H. Review and adjust the plan as needed to accommodate user needs

Ongoing tasks and processes

Create automated tests

  • Testing decreases risk and can increase trust in a product. (Here, testing is listed as the first item, but testing should continue to be part of each step of the process and integrated into the deploying and publishing code.)

Document code

  • Well-documented code is easier to maintain, and adding functionality or improving the codebase (refactoring) becomes faster and easier. Additional details:
    • Include more in-line explanations of functions.
    • The FEC would also benefit from better documentation of the data as it originates from a form, goes through processing, and is made available to the public.
    • Make sure outputs are publicly documented.

Open source the code base

  • Publish the code as components are developed or redeveloped. Consider beginning with the test suite.
  • Grow FEC's GitHub organization and publish more repositories. GitHub is also a good platform for engaging the developer community that uses FEC data or interfaces with FEC systems.

Move to the cloud

  • As the new version is built out, build 12-factor applications on a scalable cloud platform. This will greatly increase the capacity and resilience of e-filing systems.
    • The FEC could use the same platform to unite its data systems, because that platform is already being used for the FEC’s new website. Additionally, FEC staff are gaining skills on it, so they will already know how to use it. As of January 2017, the platform is on a path to receive government acquisition approvals, which will help systems incorporating it achieve many of the necessary compliance steps. Since the platform is open source and its architecture is similar to other vendors’, the FEC would also have the flexibility to adapt to a different platform as a service (PaaS) down the line if desirable.

Use language agnostic APIs to enable components to interface with each other

  • The EFO is currently making progress toward language agnostic APIs, and this progress should continue. Additionally, making the APIs RESTful would make them easier for internal and external developers to work with.
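A RESTful design gives each resource a predictable URL and uses standard HTTP verbs, which is what makes it language agnostic: any client that speaks HTTP can use it. The routing sketch below is illustrative; the paths and handler names are invented, not the FEC's actual API:

```python
# Resource-oriented routes: nouns in the path, verbs from HTTP.
# Paths and handler names are hypothetical.
ROUTES = {
    ("GET",  "/filings"):                  "list_filings",
    ("POST", "/filings"):                  "create_filing",
    ("GET",  "/filings/<id>"):             "get_filing",
    ("GET",  "/filings/<id>/amendments"):  "list_amendments",
}

def dispatch(method: str, path: str) -> str:
    """Return the handler name for a request, or 'not_found'."""
    parts = path.strip("/").split("/")
    for (m, pattern), handler in ROUTES.items():
        pparts = pattern.strip("/").split("/")
        if m == method and len(pparts) == len(parts) and all(
                p.startswith("<") or p == actual
                for p, actual in zip(pparts, parts)):
            return handler
    return "not_found"
```

Because the URL structure is predictable, a developer who knows `/filings/<id>` exists can guess `/filings/<id>/amendments` without reading documentation for every endpoint.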

Implement changes in an agile, human centered manner

  • It is important to make changes to the plan as user needs are discovered and understood, and when unforeseen problems arise.
    • Several members of the FEC staff have Scaled Agile Framework (SAFe) training; this technique will set the project up for success. Continue making progress with short sprints, quarterly planning, and regular reporting to stakeholders.

Continue to support the committed and dedicated staff across the FEC

  • During the study, we encountered a multitude of fantastic public servants who are dedicated to transparency and ensuring that anyone can run for office.