Fulcrum was built to be extensible, enabling users to integrate best-in-class data collection tools with other applications and services for complete workflow customization. We provide a variety of integration points for getting data both into and out of Fulcrum and offer a full range of Professional Services if you need help getting things wired up.
Integrating with Google Drive
Drive has all the features one would expect of a modern file storage platform, including a generous free storage quota, secure sharing mechanisms, mobile apps for both Android & iOS, and desktop apps for integrated backup and sync. This, combined with Apps Script’s ability to set timed triggers and make external API requests, makes it ideal for automating the import and export of Fulcrum data.
Scheduling Data Imports
In our first scenario, we will configure a system where CSV files can be saved to a folder on Drive and a script running on a timer will periodically check the folder for new files. If any new files have been added since the last run, the script will parse each CSV file, loop through its rows, and import each row as a new record into a designated Fulcrum app.
We will also set it up so that after all the records have been imported, the CSV file will be moved from the Files to Import folder to the Imported Files folder and an email will be sent to notify an admin of the number of records imported. This workflow allows us to monitor the system and recover any CSV files for further inspection or troubleshooting. The timed trigger also has configurable notifications which will email the script owner if it fails to execute.
Note that Apps Script comes with some daily quotas and hard limits we need to be aware of, specifically the 100 MB/day total URL Fetch data received, the 10 MB URL Fetch POST size per call, and the 6-minute runtime per script execution. You may need a more robust solution for frequently importing large files with tens of thousands of records, but for seeding tens or hundreds of records, this should work well.
Set up the Drive folders
First up, we want to create the folders to hold the files in our Drive account. Assuming you have a Google account and are logged in, navigate your browser to https://drive.google.com and create a root folder to store everything. Give the folder a name and click Create. I named mine Fulcrum Imports.
Once the folder has been created, double-click on it to enter the folder and follow the same process to create the two folders for storing the CSV files (Files to Import and Imported Files). If someone else is going to be uploading the files, you can give them access by right-clicking the folder and selecting Share from the context menu. You may also want to install the Backup and Sync app on a computer so the CSV files can be programmatically moved or written to the shared drive rather than manually uploaded via the web.
Write the Apps Script
Now that the folders have been configured, we’re ready to start writing the logic to actually import the records. If your Drive is already connected to Apps Script, you should be able to create a new script file within the root folder you created above.
You may have to click the Connect more apps button and search for “apps script” to connect it first. You can also create the file at https://script.google.com/home and just save it to your folder. It doesn’t matter where this file is created but keeping everything organized together is always a good idea.
Creating a new file will open a blank Apps Script code editor where you can begin to write your code. First rename the file from Untitled project to something more meaningful like Fulcrum Importer by clicking on the name in the top left corner. Now you can copy and paste the code from this gist into the code window.
The code should be fairly self-explanatory with notes to help explain what’s going on, but here’s how it works:
- Update the 4 variables at the top with your Fulcrum API token, the form ID for the app you are importing into, and the IDs of the Drive folders you are working with. You can get the Drive folder IDs from the browser URL when you navigate into the folder. The ID is at the end of the URL after folders/.
- In the importData function you need to build the record JSON that is expected by the Fulcrum API. You can find the element keys by viewing the form endpoint at /api/v2/forms/:id.json and see what Fulcrum expects by reviewing the JSON for an existing record at /api/v2/records/:id.json.
- UrlFetchApp is a special Apps Script class for making HTTP requests and receiving responses. We pass the stringified record JSON in the request payload, authenticate by passing the API token as a request header, and POST it to the Fulcrum records API.
- After every row has been POSTed, we move the file out of the Files to Import folder and into the Imported Files folder and call the sendEmail function with the record count. Be sure to update the email to value with a valid email address for verification. This process is repeated for every CSV file in the folder.
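The steps above can be sketched as a minimal Apps Script outline. This is an illustrative sketch, not the gist itself: the API token, form ID, folder IDs, and the element keys ('1e2f', '3a4b') are placeholders you would replace with the values from your own Fulcrum app, and it assumes a CSV with a header row followed by latitude, longitude, and two field columns.

```javascript
// Hedged sketch of the importer. The token, form ID, folder IDs, and element
// keys below are placeholders -- substitute your own values.
var API_TOKEN = 'your-fulcrum-api-token';
var FORM_ID = 'your-form-id';
var IMPORT_FOLDER_ID = 'files-to-import-folder-id';
var IMPORTED_FOLDER_ID = 'imported-files-folder-id';

// Build the record JSON the Fulcrum API expects from one CSV row.
// The element keys are hypothetical; look yours up at /api/v2/forms/:id.json.
function buildRecord(row) {
  return {
    record: {
      form_id: FORM_ID,
      latitude: Number(row[0]),
      longitude: Number(row[1]),
      form_values: {
        '1e2f': row[2], // e.g. a name field
        '3a4b': row[3]  // e.g. a notes field
      }
    }
  };
}

// POST one record to the Fulcrum records API, authenticating via header.
function importData(row) {
  var options = {
    method: 'post',
    contentType: 'application/json',
    headers: { 'X-ApiToken': API_TOKEN },
    payload: JSON.stringify(buildRecord(row))
  };
  UrlFetchApp.fetch('https://api.fulcrumapp.com/api/v2/records.json', options);
}

// Check the import folder, import each CSV, then move the file and notify.
function getFiles() {
  var importFolder = DriveApp.getFolderById(IMPORT_FOLDER_ID);
  var doneFolder = DriveApp.getFolderById(IMPORTED_FOLDER_ID);
  var files = importFolder.getFilesByType(MimeType.CSV);
  while (files.hasNext()) {
    var file = files.next();
    var rows = Utilities.parseCsv(file.getBlob().getDataAsString());
    for (var i = 1; i < rows.length; i++) { // skip the header row
      importData(rows[i]);
    }
    file.moveTo(doneFolder); // relocate the CSV to the Imported Files folder
    sendEmail(rows.length - 1);
  }
}

// Email a simple confirmation with the record count.
function sendEmail(count) {
  MailApp.sendEmail('admin@example.com', 'Fulcrum import complete',
    'Imported ' + count + ' records.');
}
```

The header-row skip (`i = 1`) assumes the first CSV line contains column names; drop it if your files have no header.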
Once you’ve updated the code with your particular details, save the Apps Script file and give it a test by placing a CSV file with a couple of test rows in the import folder and manually triggering the script by clicking Run > Run function > getFiles. The first time you run this you will be prompted to authorize the script to view and manage the files in your Google Drive, send email as you, and connect to an external service via your Google account.
If everything runs successfully, you should receive an email at the address you provided and the rows in the CSV file should have been imported as records in your Fulcrum app. If you run into any errors, it’s likely an issue with the record JSON. You may want to try hardcoding this and running it again to help with debugging. Apps Script also has a Logger class for writing out text to the debugging logs.
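One way to debug the record JSON, sketched below with placeholder token and form ID, is to hardcode a stripped-down record, mute HTTP exceptions so error responses come back instead of throwing, and write the status code and body to the logs:

```javascript
// Hedged debugging sketch -- the token and form ID are placeholders.
// Build the UrlFetchApp options for POSTing a record to Fulcrum.
function buildRequestOptions(token, record) {
  return {
    method: 'post',
    contentType: 'application/json',
    headers: { 'X-ApiToken': token },
    payload: JSON.stringify(record),
    muteHttpExceptions: true // return error responses instead of throwing
  };
}

// Hardcode a minimal record and log the API's response for inspection.
function debugImport() {
  var record = {
    record: {
      form_id: 'your-form-id',
      latitude: 27.77,
      longitude: -82.63,
      form_values: {} // start empty, then add element keys one at a time
    }
  };
  var options = buildRequestOptions('your-fulcrum-api-token', record);
  var response = UrlFetchApp.fetch('https://api.fulcrumapp.com/api/v2/records.json', options);
  Logger.log(response.getResponseCode());
  Logger.log(response.getContentText()); // inspect via View > Logs
}
```

Starting with empty form_values and adding one element key per run is a quick way to isolate which key the API is rejecting.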
Set up Timed Trigger
Once you’ve verified that the script is working as expected, you can set up a trigger so it will run automatically without any user input. Click Edit > Current project’s triggers and click the link to add a new trigger. You want to run the getFiles function with a Time-driven event and specify the timer type and time range. I’ve set the trigger to run the script daily between midnight and 1am so new records should be available for field crews in the morning.
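If you prefer, the same trigger can be created in code with the ScriptApp service rather than through the UI. A minimal sketch, assuming the importer function is named getFiles as above:

```javascript
// Hedged alternative to the trigger UI: create the nightly trigger in code.
// Run this once; Apps Script persists the trigger afterwards.
function createNightlyTrigger() {
  ScriptApp.newTrigger('getFiles') // the function the trigger will invoke
    .timeBased()
    .everyDays(1)
    .atHour(0) // between midnight and 1am
    .create();
}
```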
Scheduling Data Exports
Now that we’ve walked through the basic process of wiring up Drive with Fulcrum for imports, we can easily expand upon this technique to schedule a daily data export of records from Fulcrum. The only addition we need for this is an exportData function to fetch the data via Fulcrum’s Query API and write it to a Drive folder.
Set up the Drive folders
Follow the same process above to create a project folder (Fulcrum Exports) with a folder inside it to store the files (Files). Note the ID of the Files folder for the script to write to below.
Write the Apps Script
Create a new Apps Script file in the project folder (Fulcrum Exporter) and copy and paste the code from this gist into the code window.
In this example we are writing a SQL query to fetch all the columns for all the records WHERE _server_updated_at >= NOW() - '1 day'::INTERVAL. This uses the _server_updated_at timestamp to filter down to only the records that have been updated within the last 24 hours, which, combined with a daily trigger, automates our daily export.
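An exportData function along these lines could run the query against Fulcrum’s Query API and write the CSV response into the Files folder. This is a hedged sketch, not the gist: the API token, the app name ("My App"), and the folder ID are placeholder assumptions.

```javascript
// Hedged sketch of the exporter -- token, app name, and folder ID
// are placeholders to replace with your own values.
var API_TOKEN = 'your-fulcrum-api-token';
var EXPORT_FOLDER_ID = 'files-folder-id';

// Build the SQL that selects records updated in the last 24 hours.
function buildQuery(appName) {
  return 'SELECT * FROM "' + appName + '" ' +
         "WHERE _server_updated_at >= NOW() - '1 day'::INTERVAL;";
}

// Fetch the query results as CSV and save them to the Drive folder.
function exportData() {
  var url = 'https://api.fulcrumapp.com/api/v2/query' +
            '?format=csv&q=' + encodeURIComponent(buildQuery('My App'));
  var response = UrlFetchApp.fetch(url, {
    headers: { 'X-ApiToken': API_TOKEN }
  });
  var fileName = 'export-' +
    Utilities.formatDate(new Date(), 'UTC', 'yyyy-MM-dd') + '.csv';
  DriveApp.getFolderById(EXPORT_FOLDER_ID)
    .createFile(fileName, response.getContentText(), MimeType.CSV);
}
```

Scheduling exportData with the same kind of daily time-driven trigger used for the importer completes the round trip.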
This has been a fairly technical post walking through the process of integrating Fulcrum with Google Drive for automating data imports and exports via the Fulcrum API. Hopefully it has provided some useful concepts and code samples for you to begin extending or automating your own data collection workflow, but our Professional Services team is always available to help you get the most out of your Fulcrum experience! We’d love to hear how Fulcrum fits into your broader workflow, so feel free to share your tips and tricks @fulcrumapp.