Microsoft Fabric
How to connect your RevOS DWH data (e.g. from HubSpot) to Microsoft Fabric
Integration with Microsoft Fabric is simple to set up, but there are several options for bringing your data into Microsoft Fabric OneLake. Generally there are two options that look very similar in the UI but work completely differently: Data Pipelines and Dataflows Gen2 (using Power Query).
With Data Pipelines you can use a Copy job to copy data from the RevOS workspace to OneLake; however, we do not recommend this. A Copy job can be complicated to set up, and a dataflow is the simpler option. This tutorial shows you how to do it quickly and easily. Dataflows Gen2 is a powerful mechanism that also supports advanced scenarios such as incremental refresh and historical data with slowly changing dimensions type 2 (SCD Type 2).
Prerequisites
- Microsoft Fabric Workspace (trial version will also work)
Setup guide
Activate a PowerBI integration in RevOS
Select Integrations
and click Enable
on Power BI / Fabric in the RevOS sources list:

After you enable it, you’ll see the connection credentials required for the next steps:

These are (1) the Service account e-mail, (2) the Service account key, and (3) the Dataset name.
The Service Account JSON is not displayed here; you can only copy it to the clipboard by clicking the ‘copy’ icon next to it.
Create new Dataflow

Please use a Gen2 dataflow for advanced functionality and better future-proofing.
After you have named your dataflow, make sure the default destination is set to your Lakehouse/Warehouse and select Get data from another source.

Then enter “big query” in the search dialog and select the Google BigQuery connector.

Then expand the Advanced options and change the following default options:
- Billing Project ID should be revos-ai
- Use Storage Api should be false
- Project ID should be revos-ai
Like on the screenshot:

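For reference, these settings also appear in the dataflow’s M code. The snippet below is only a sketch of what the generated source step may look like with the options above (the option names BillingProject and UseStorageApi follow the Google BigQuery connector’s M options; your generated query may differ):

```
let
    // Connect to BigQuery billing against the revos-ai project,
    // with the optional Storage API disabled
    Source = GoogleBigQuery.Database([
        BillingProject = "revos-ai",
        UseStorageApi = false
    ])
in
    Source
```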
Then, under Connection credentials, select Service Account Login
(1) and enter the provided Service account email
(2) and the Service account JSON file content
(3) that you copied from RevOS:

Then click Next and you will see the UI to choose tables; expand the revos-ai
project to see your data underneath.
Please note that you will only see data there if you have connected other systems to RevOS beforehand (e.g. your HubSpot; for how to connect it, see here)

In this dialog you can pick and choose the tables you would like to import into your OneLake Lakehouse or Warehouse:

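For each selected table, the dataflow generates a navigation query. As a rough, hypothetical sketch of what such a query may look like (dataset and table names below are placeholders, and the step names in your generated query may differ):

```
let
    Source = GoogleBigQuery.Database([BillingProject = "revos-ai", UseStorageApi = false]),
    // Navigate: project -> dataset -> table (names are placeholders)
    Project = Source{[Name = "revos-ai"]}[Data],
    Dataset = Project{[Name = "your_dataset"]}[Data],
    Table = Dataset{[Name = "your_table"]}[Data]
in
    Table
```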
Don’t forget to set the Default Destination for your Dataflow Gen2 and make sure the queries in the dataflow are bound to the destination:

After that you are ready to publish the dataflow and run it for the first time. Once the first refresh succeeds, you will see your data in your Lakehouse/Warehouse:

and start building your semantic model:

(Optional) Schedule regular refresh
Data in your RevOS DWH is updated according to your RevOS configuration, so it makes sense to set up automatic data refresh in the Fabric Lakehouse/Warehouse. To do this, configure an automatic refresh on your dataflow: click ‘…’ on the dataflow and select Settings:

There, expand Refresh and configure the refresh schedule:

Frequently Asked Questions
I’m getting an “Unable to authenticate with Google BigQuery Storage API”
error
If you see the following error:
OLE DB or ODBC error: [DataSource.Error] ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API.

the reason is that the Power BI connector is configured to use the optional Storage API. You can disable it by setting the advanced connector option Use Storage Api
to false
or by modifying the M code that fetches tables from = GoogleBigQuery.Database()
to = GoogleBigQuery.Database([UseStorageApi=false])
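In the dataflow’s Advanced editor the change looks like this (a sketch; your generated query may pass additional options to GoogleBigQuery.Database):

```
// Before: the connector attempts to use the Storage API and fails to authenticate
Source = GoogleBigQuery.Database()

// After: the Storage API is disabled
Source = GoogleBigQuery.Database([UseStorageApi = false])
```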
Is my data secure when transferred between Fabric OneLake and RevOS?
Yes. The connector used to fetch the data is built and certified by Microsoft and transfers data over encrypted channels using state-of-the-art, regularly updated encryption mechanisms.
How can I configure the data refresh frequency in Fabric for HubSpot data?
Your data in RevOS is updated at least every 24 hours (or more frequently, depending on your configuration and pricing plan). You can configure the refresh interval on your dataflow.
What HubSpot data entities are supported in the Power BI integration?
See the full list of entities in the documentation for the individual connectors, e.g.
Can I customize the HubSpot data before importing it into Power BI?
It depends on the data: results of your scoring models can be edited in RevOS, while data synced from other systems is exposed to the dataflow as-is. However, you can use Power Query in the dataflow or DAX in the semantic model to modify the data during and/or after import into Fabric.
How do I resolve connection errors during setup?
Connection errors can arise from several issues, such as incorrect API key permissions or firewall settings blocking data transfer. Ensure that your Service Account e-mail and JSON key are configured correctly and that you can access the RevOS platform. If issues persist, reach out to support at revos.ai and we will help you.
Last updated on May 30, 2025