Running complex queries across external databases is why you need a Data Fabric
A virtual data warehouse enables joins across many data stores, networks, and clouds, creating the views that your tools need. This is a step change in the data platform and integration world, where the old approach (build a data warehouse or data lake) means data must be moved, restructured, and transformed before any value can be created. In the traditional data lake or data warehouse approach (the centralized model), data quality issues arise from mistakes made during the costly physical data integration project. Zetaris has solved these massive cost inefficiencies and technical barriers to analytics at scale. Zetaris changes everything! We don't move or duplicate data for analysis; we implement a virtual data warehouse.
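To make the federated idea concrete, here is a minimal sketch (illustrative only, not the Zetaris implementation) using SQLite's `ATTACH DATABASE` to run a single SQL join across two separate databases without first copying either into a central store. All table names and data are hypothetical:

```python
import os
import sqlite3
import tempfile

# First "data store": a CRM database with customer records (hypothetical data).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
crm.commit()

# Second "data store": an orders database, persisted to its own file.
orders_path = os.path.join(tempfile.mkdtemp(), "orders.db")
orders = sqlite3.connect(orders_path)
orders.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
orders.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)
orders.commit()
orders.close()

# Federated query: attach the second store and join across both in one SQL
# statement. The data stays where it lives; only the result is materialized.
crm.execute("ATTACH DATABASE ? AS ordersdb", (orders_path,))
rows = crm.execute(
    """
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN ordersdb.orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
    """
).fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 75.0)]
```

A real data fabric does this across heterogeneous engines and networks, but the principle is the same: one query plan spanning sources, no physical consolidation step.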
Using the Zetaris Data Fabric, BI tools can be massively extended in the breadth of data they can analyze, moving from a limited set within the tool or data warehouse to, potentially, all the relevant and available data across the internet. Data in the stream can be instantly aggregated into dynamic views and combined with historic data held in many different databases, files, and other stores, forming a data fabric that supports a high-value view for your tools. And by leveraging the Zetaris Data Fabric's mixed-streaming analytics capability (the ability to analyze data in the stream together with data in the warehouse or lake in real time), data latency improves from intra-day (at best) to near real-time.
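The mixed-streaming idea can be sketched in a few lines. This is a toy illustration under stated assumptions, not the Zetaris engine: `historic` stands in for a warehouse snapshot, and `stream` stands in for a live feed; both are hypothetical:

```python
from collections import defaultdict

# Historic totals per sensor, as if read from a warehouse (hypothetical snapshot).
historic = {"sensor_a": 10.0, "sensor_b": 4.0}

# Live events "in the stream" -- a plain list standing in for a real feed.
stream = [("sensor_a", 2.5), ("sensor_b", 1.0), ("sensor_a", 0.5)]

# Mixed view: fold live events into the historic totals as they arrive,
# yielding a near-real-time aggregate without rewriting the warehouse.
live = defaultdict(float, historic)
for sensor, value in stream:
    live[sensor] += value

print(dict(live))  # {'sensor_a': 13.0, 'sensor_b': 5.0}
```

The point of the sketch is the shape of the computation: the historic side is read once, and each streamed event updates the combined view incrementally, which is why latency drops from batch (intra-day) to near real-time.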
The video below shows how a Zetaris Data Fabric is connected with Tableau. In this video, a pre-built Virtual Data Warehouse, deployed with Zetaris Data Fabric technology, is used by Tableau as its data source.