On the day prior to the conference, we held multiple interactive sessions with industry specialists.
With the continuous growth in dataset sizes and the declining cost of storage, it’s becoming more common to use a data warehouse like Amazon Redshift for geospatial datasets. With AWS Data Exchange, you can find and subscribe to third-party datasets and seamlessly query them alongside your first-party data in Amazon Redshift. You can run spatial joins, built-in spatial functions, and machine learning models directly in Amazon Redshift using SQL that analysts are already familiar with. By keeping the data in Amazon Redshift, you benefit from the best price-performance of any cloud data warehouse. With these geospatial capabilities, combined with the tools from CARTO, it’s possible to solve most spatial use cases natively in the cloud.
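As a flavour of the kind of query involved, here is a minimal sketch of a spatial join in Redshift SQL, driven from Python via the redshift_connector driver. The connection details and the tables (`customers`, `neighborhoods`) are hypothetical placeholders, not datasets from the workshop.

```python
# Minimal sketch: a spatial join in Amazon Redshift, run from Python.
# Assumes hypothetical tables: customers(id, geom) holding first-party points
# and neighborhoods(name, geom) holding third-party polygons.
import redshift_connector

conn = redshift_connector.connect(
    host="my-cluster.abc123.eu-west-1.redshift.amazonaws.com",  # placeholder
    database="dev",
    user="analyst",
    password="...",  # placeholder credentials
)

query = """
    SELECT n.name, COUNT(c.id) AS customer_count
    FROM neighborhoods n
    JOIN customers c
      ON ST_Contains(n.geom, c.geom)  -- built-in Redshift spatial predicate
    GROUP BY n.name
    ORDER BY customer_count DESC;
"""

cursor = conn.cursor()
cursor.execute(query)
for name, count in cursor.fetchall():
    print(name, count)
```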
This practical workshop will teach geospatial professionals to leverage Snowflake and CARTO capabilities for location-based applications. Using a combination of Snowflake's built-in functions and the CARTO Analytics Toolbox, participants will answer business questions and explore real-life use cases. The session is suitable for geospatial practitioners at all levels, from beginners to advanced developers looking to explore Snowflake's latest features.
Mobility data alone doesn’t provide much value unless it’s analysed and transformed into actionable insights. The technical difficulty of cleaning and processing the data, and the expertise needed to extract customer behaviour from it, can be a real pain point, but once mastered it is a game changer for businesses. In this hands-on workshop, attendees will work through a real-world use case and learn about the tools needed to drive business decisions. During the workshop, attendees will download Echo's mobility data and use Python, GeoPandas and a Jupyter notebook to master their analysis of mobility insights.
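As a taste of that workflow, here is a minimal GeoPandas sketch under assumed inputs: a CSV of mobility pings and a GeoJSON of study areas. The file names and column names are illustrative, not the workshop's actual data.

```python
# Minimal sketch of the mobility workflow: load raw pings, build point
# geometries, and spatially join them to areas of interest.
import geopandas as gpd
import pandas as pd

pings = pd.read_csv("mobility_pings.csv")     # columns: device_id, lon, lat, timestamp
areas = gpd.read_file("study_areas.geojson")  # polygons with an 'area_name' column

gdf = gpd.GeoDataFrame(
    pings,
    geometry=gpd.points_from_xy(pings["lon"], pings["lat"]),
    crs="EPSG:4326",
)

# Attach each ping to the area it falls in, then count unique devices per area.
joined = gdf.sjoin(areas.to_crs("EPSG:4326"), predicate="within")
visits = joined.groupby("area_name")["device_id"].nunique().sort_values(ascending=False)
print(visits.head())
```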
In this hands-on session, you will learn how to use the scalable and powerful data processing capabilities of the Databricks platform and the Mosaic package to ingest raster data layers and convert pixel-based data to H3 cells with measurements, as well as ingest vector data layers and tessellate their geometries to H3 cells with measurements. Once all the layers are represented in H3, you will be able to use the grid to generate features for ML/AI models. Finally, you will use the Databricks AutoML solution to build a flood risk model with no-code modeling.
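The Mosaic API itself is beyond a short snippet, but the core pixel-to-H3 idea can be illustrated with the plain h3 Python library (v4 API) and synthetic pixel data; this is a conceptual sketch, not the Databricks Mosaic code used in the workshop.

```python
# Conceptual sketch of converting raster pixels to H3 cells with measurements,
# using the h3-py v4 API rather than Databricks Mosaic. Each pixel centre
# (lat, lon, value) is assigned an H3 cell, then values are averaged per cell.
import h3
import pandas as pd

pixels = pd.DataFrame({             # synthetic stand-in for a raster layer
    "lat":   [51.501, 51.502, 51.503],
    "lon":   [-0.141, -0.142, -0.143],
    "value": [0.8, 0.6, 0.9],       # e.g. a flood-depth measurement
})

RESOLUTION = 9
pixels["h3_cell"] = [
    h3.latlng_to_cell(lat, lon, RESOLUTION)
    for lat, lon in zip(pixels["lat"], pixels["lon"])
]

# One averaged measurement per H3 cell, ready to join with other H3 layers.
per_cell = pixels.groupby("h3_cell")["value"].mean()
print(per_cell)
```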
This workshop covers generating geodemographic profiles for locations of interest using open data (Census 2021 output area variables), cloud geospatial processing and AutoML capabilities. Learn how to use new features of Snowflake and Google BigQuery (£0 for personal and small-project use) to create sophisticated geospatial analytical outputs, including:
* the results: cluster analysis of the given geographic areas (e.g. store catchment) using census parameters, as sketched below;
* the processing pipelines in native SQL and Python code, which are parameterised, scalable and re-usable;
* the geocoded census feature dataset, to be made available through 'marketplace' capabilities;
A variation of the processing that incorporates the H3 geographic indexing capability also forms part of the work. The routines require little more than basic SQL knowledge, but some familiarity with Python would also be helpful. The workshop will be presented on screen, but the data and instructions will be provided for participants to follow along in their own personal GCP or Snowflake environment. It is recommended that participants sign up for either GCP (for BigQuery) or Snowflake ahead of the workshop and familiarise themselves with the basic capabilities.
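As a rough illustration of the clustering step, run locally with scikit-learn rather than inside BigQuery or Snowflake, geodemographic profiles can be derived along these lines; the file name and census variable names are hypothetical.

```python
# Rough sketch of the cluster-analysis step using scikit-learn. Assumes a
# hypothetical CSV of Census 2021 output areas with numeric variables.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

oa = pd.read_csv("census_output_areas.csv")  # hypothetical feature dataset
features = oa[["pop_density", "pct_owner_occupied", "median_age"]]  # illustrative variables

# Standardise so no single census variable dominates the distance metric.
X = StandardScaler().fit_transform(features)

# Assign each output area to one of k geodemographic clusters.
oa["cluster"] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(oa.groupby("cluster").size())
```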
Starting is often the hardest part. Google Earth Engine (GEE) is capable of planetary-scale geospatial analysis at lightning speed, with a focus on Earth observation data. Interactive and queryable GEE applications have been developed, including green roof identification, illegal forest harvesting detection, air pollution monitoring and much more. Alongside this, active GEE users have created a community catalogue holding data not hosted by Google. But where to begin with such a powerful tool? This workshop focuses on the fundamental components of Earth Engine. It touches upon its structure and then swiftly moves on to loading, querying, manipulating, and exporting data. Participants will leave with a foundational understanding of the tool, ready for their next planetary-scale project.
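For a sense of that load/query/export cycle, here is a minimal sketch in the Earth Engine Python API; the dataset ID is a real public collection, but the region and export settings are placeholders.

```python
# Minimal sketch of the Earth Engine load/query/export cycle in the Python API.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([-0.5, 51.3, 0.3, 51.7])  # rough London bounding box

# Load a collection, filter it in space and time, and reduce to one image.
composite = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterBounds(region)
    .filterDate("2023-06-01", "2023-09-01")
    .median()
    .clip(region)
)

# Export the result; the task runs server-side on Google's infrastructure.
task = ee.batch.Export.image.toDrive(
    image=composite,
    description="summer_composite",
    region=region,
    scale=10,
)
task.start()
```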
In this session, Simon & Antonio will guide you through building an urban liveability index using spatial data and analysis, going from raw data to clear insights and findings using SQL, CARTO Workflows and data visualization.
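Conceptually, such an index is a weighted combination of normalised indicators. A toy version, with invented indicators and weights, might look like this; the workshop builds its index with SQL and CARTO Workflows instead.

```python
# Toy sketch of a composite liveability index: normalise each indicator to
# the 0-1 range, then combine with weights. Indicators and weights are
# invented for illustration only.
import pandas as pd

areas = pd.DataFrame({
    "area":        ["A", "B", "C"],
    "green_space": [0.30, 0.10, 0.55],  # share of area that is green space
    "transit":     [12, 4, 8],          # transit stops per km^2
    "noise_db":    [62, 55, 70],        # average noise level (lower is better)
})

def minmax(s):
    return (s - s.min()) / (s.max() - s.min())

score = (
    0.4 * minmax(areas["green_space"])
    + 0.4 * minmax(areas["transit"])
    + 0.2 * (1 - minmax(areas["noise_db"]))  # invert: quieter is more liveable
)
areas["liveability"] = score.round(2)
print(areas[["area", "liveability"]])
```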
Join us to get hands-on experience exploring the impacts of climate change on insurable assets with CARTO’s Analytics Toolbox. You'll work with real-world data, run an end-to-end advanced analysis to quantify the increased risk and vulnerability of assets to climate-related hazards, and learn how to create visualizations that reveal hidden patterns and valuable insights.