Chapter 2. PostGIS Installation

This chapter details the steps required to install PostGIS.

2.1. Short Version

To compile, assuming you have all the dependencies in your search path:

tar xvfz postgis-3.0.4.tar.gz
cd postgis-3.0.4
./configure
make
make install

Once PostGIS is installed, it needs to be enabled in each individual database you want to use it in.

[Note]

Installing PostGIS with the extension process is preferred and more user-friendly. To spatially enable your database:

psql -d yourdatabase -c "CREATE EXTENSION postgis;"

-- if you built with raster support and want to install it --
psql -d yourdatabase -c "CREATE EXTENSION postgis_raster;"

-- if you want to install topology support --
psql -d yourdatabase -c "CREATE EXTENSION postgis_topology;"

-- if you built with sfcgal support and want to install it --
psql -d yourdatabase -c "CREATE EXTENSION postgis_sfcgal;"

-- if you want to install tiger geocoder --
psql -d yourdatabase -c "CREATE EXTENSION fuzzystrmatch"
psql -d yourdatabase -c "CREATE EXTENSION postgis_tiger_geocoder;"

-- if you built with PCRE support
-- you should have the address standardizer extension as well
psql -d yourdatabase -c "CREATE EXTENSION address_standardizer;"

Please refer to Section 2.5.3, “Building PostGIS Extensions and Deploying them” for more details about querying installed/available extensions and upgrading extensions, or switching from a non-extension install to an extension install.
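
For example, a sketch of checking what is packaged and upgrading an already installed extension (the target version shown here is only an illustration; use whatever default_version the query reports):

-- see which extension scripts are available and which are installed
SELECT name, default_version, installed_version
  FROM pg_available_extensions WHERE name = 'postgis';

-- upgrade an installed extension to the packaged version
ALTER EXTENSION postgis UPDATE TO "3.0.4";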

For those who decided, for some reason, to use a non-extension based install, here are the longer, more painful instructions:

Once installed, all the .sql files will be located in the share/contrib/postgis-3.0 folder of your PostgreSQL installation.

createdb yourdatabase
createlang plpgsql yourdatabase
psql -d yourdatabase -f postgis.sql
psql -d yourdatabase -f postgis_comments.sql
psql -d yourdatabase -f spatial_ref_sys.sql

-- if you want to enable topology
psql -d yourdatabase -f topology.sql
psql -d yourdatabase -f topology_comments.sql

-- if you want to enable raster
-- and only if you compiled with raster (GDAL)
psql -d yourdatabase -f rtpostgis.sql
psql -d yourdatabase -f raster_comments.sql

-- if you want to enable sfcgal backend
-- and only if you built with sfcgal support --
psql -d yourdatabase -f sfcgal.sql
psql -d yourdatabase -f sfcgal_comments.sql

2.2. Configuring raster

If you enabled raster support, you may want to read below how to configure it properly.

As of PostGIS 2.1.3, out-of-db rasters and all raster drivers are disabled by default. In order to re-enable these, you need to set the environment variables POSTGIS_GDAL_ENABLED_DRIVERS and POSTGIS_ENABLE_OUTDB_RASTERS in the server environment. As of PostGIS 2.2, you can instead use the more cross-platform approach of setting the corresponding Section 8.22, “Grand Unified Custom Variables (GUCs)”.
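
As a sketch of the GUC approach (the database name mygisdb is a placeholder; the GUC names are the ones documented in Section 8.22), the settings can be applied from SQL at the system or database level:

-- system wide (requires superuser, takes effect after a reload)
ALTER SYSTEM SET postgis.enable_outdb_rasters = true;
ALTER SYSTEM SET postgis.gdal_enabled_drivers = 'ENABLE_ALL';
SELECT pg_reload_conf();

-- or per database
ALTER DATABASE mygisdb SET postgis.enable_outdb_rasters = true;
ALTER DATABASE mygisdb SET postgis.gdal_enabled_drivers = 'GTiff PNG JPEG';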

If you want to enable out-of-db rasters via the environment variable:

POSTGIS_ENABLE_OUTDB_RASTERS=1

Any other value, or no value at all, will leave out-of-db rasters disabled.

In order to enable all GDAL drivers available in your GDAL install, set this environment variable as follows:

POSTGIS_GDAL_ENABLED_DRIVERS=ENABLE_ALL

If you want to only enable specific drivers, set your environment variable as follows:

POSTGIS_GDAL_ENABLED_DRIVERS="GTiff PNG JPEG GIF XYZ"
[Note]

If you are on Windows, do not quote the driver list.

Setting environment variables varies depending on the OS. For PostgreSQL installed on Ubuntu or Debian via apt, the preferred way is to edit /etc/postgresql/10/main/environment, where 10 refers to the version of PostgreSQL and main refers to the name of the cluster.
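
As a rough sketch only (check the pg_ctlcluster documentation for the exact syntax this file accepts), the entries might look like:

# /etc/postgresql/10/main/environment
POSTGIS_ENABLE_OUTDB_RASTERS = 1
POSTGIS_GDAL_ENABLED_DRIVERS = 'GTiff PNG JPEG GIF'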

On Windows, if you are running PostgreSQL as a service, you can set these via System variables. On Windows 7, right-click Computer -> Properties -> Advanced System Settings, or in Explorer navigate to Control Panel\All Control Panel Items\System. Then click Advanced System Settings -> Advanced -> Environment Variables and add new system variables.

After you set the environment variables, you'll need to restart your PostgreSQL service for the changes to take effect.
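
For example (service names and cluster numbers vary by platform; these are typical defaults, not a guarantee):

# Debian/Ubuntu with systemd
sudo systemctl restart postgresql
# or restart a specific cluster
sudo pg_ctlcluster 10 main restart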

2.3. Install Requirements

PostGIS has the following requirements for building and usage:

Required

  • PostgreSQL 9.5 - 13. A complete installation of PostgreSQL (including server headers) is required. PostgreSQL is available from http://www.postgresql.org .

    For a full PostgreSQL / PostGIS support matrix and PostGIS/GEOS support matrix refer to http://trac.osgeo.org/postgis/wiki/UsersWikiPostgreSQLPostGIS

  • GNU C compiler ( gcc ). Some other ANSI C compilers can be used to compile PostGIS, but we find far fewer problems when compiling with gcc .

  • GNU Make ( gmake or make ). For many systems, GNU make is the default version of make. Check the version by invoking make -v . Other versions of make may not process the PostGIS Makefile properly.

  • Proj4 reprojection library, version 4.6.0 or greater. Proj4 4.9 or above is needed to take advantage of improved geodetic support. The Proj4 library is used to provide coordinate reprojection support within PostGIS. Proj4 is available for download from http://trac.osgeo.org/proj/ .

  • GEOS geometry library, version 3.6 or greater, but GEOS 3.8+ is recommended to take full advantage of all the new functions and features. GEOS is available for download from http://trac.osgeo.org/geos/ .

  • LibXML2, version 2.5.x or higher. LibXML2 is currently used in some import functions (ST_GeomFromGML and ST_GeomFromKML). LibXML2 is available for download from http://xmlsoft.org/downloads.html .

  • JSON-C, version 0.9 or higher. JSON-C is currently used to import GeoJSON via the function ST_GeomFromGeoJson. JSON-C is available for download from https://github.com/json-c/json-c/releases/ .

  • GDAL, version 1.8 or higher (1.9 or higher is strongly recommended, since some things will not work well or will behave differently with lower versions). This is required for raster support. http://trac.osgeo.org/gdal/wiki/DownloadSource .

  • If compiling with PostgreSQL+JIT, LLVM version >=6 is required https://trac.osgeo.org/postgis/ticket/4125 .

Optional

  • GDAL (pseudo-optional): you can leave it out only if you don't want raster support. Also make sure to enable the drivers you wish to use, as described in Section 2.2, “Configuring raster” .

  • GTK (requires GTK+2.0, 2.8+) to compile the shp2pgsql-gui shape file loader. http://www.gtk.org/ .

  • SFCGAL, version 1.1 or higher, can be used to provide additional advanced 2D and 3D analysis functions to PostGIS (cf. Section 8.19, “SFCGAL Functions”). It also allows SFCGAL rather than GEOS to be used for some 2D functions provided by both backends (such as ST_Intersection or ST_Area). The PostgreSQL configuration variable postgis.backend lets the end user control which backend is used when SFCGAL is installed (GEOS by default); see the example after this list. Note: SFCGAL 1.2 requires at least CGAL 4.3 and Boost 1.54 (cf. http://oslandia.github.io/SFCGAL/installation.html ) https://github.com/Oslandia/SFCGAL .

  • In order to build the Chapter 12, Address Standardizer you will also need PCRE http://www.pcre.org (which is generally already installed on Unix-like systems). The Regex::Assemble Perl CPAN package is needed only if you want to rebuild the data encoded in parseaddress-stcities.h . Chapter 12, Address Standardizer will automatically be built if a PCRE library is detected, or if you pass a valid --with-pcredir=/path/to/pcre during configure.

  • To enable ST_AsMVT, the protobuf-c library (for usage) and the protoc-c compiler (for building) are required. Also, pkg-config is required to verify the correct minimum version of protobuf-c. See protobuf-c . By default, PostGIS will use Wagyu to validate MVT polygons faster, which requires a C++11 compiler. It will use CXXFLAGS and the same compiler as the PostgreSQL installation. To disable this and use GEOS instead, use the --without-wagyu switch during the configure step.

  • CUnit ( CUnit ). This is needed for regression testing. http://cunit.sourceforge.net/

  • DocBook ( xsltproc ) is required for building the documentation. Docbook is available from http://www.docbook.org/ .

  • DBLatex ( dblatex ) is required for building the documentation in PDF format. DBLatex is available from http://dblatex.sourceforge.net/ .

  • ImageMagick ( convert ) is required to generate the images used in the documentation. ImageMagick is available from http://www.imagemagick.org/ .
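
As mentioned in the SFCGAL item above, the backend used for the overlapping 2D functions can be switched at runtime through the postgis.backend configuration variable; a minimal sketch:

-- use SFCGAL where a function exists in both backends (current session only)
SET postgis.backend = sfcgal;
-- switch back to the default GEOS backend
SET postgis.backend = geos;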

2.4. Getting the Source

Retrieve the PostGIS source archive from the downloads website http://download.osgeo.org/postgis/source/postgis-3.0.4.tar.gz

wget http://download.osgeo.org/postgis/source/postgis-3.0.4.tar.gz
tar -xvzf postgis-3.0.4.tar.gz

This will create a directory called postgis-3.0.4 in the current working directory.

Alternatively, check out the source from the git repository https://git.osgeo.org/gitea/postgis/postgis/ .

git clone -b stable-3.0 https://git.osgeo.org/gitea/postgis/postgis.git postgis-3.0

PostGIS also offers several git mirrors .

Change into the newly created postgis-3.0 directory to continue the installation.

2.5. Compiling and Install from Source: Detailed

[Note]

Many operating systems now include pre-built packages for PostgreSQL/PostGIS. In many cases, compilation is only necessary if you want the most bleeding-edge versions or you are a package maintainer.

This section includes general compilation instructions. If you are compiling for Windows or another OS, you may find additional, more detailed help at the PostGIS User contributed compile guides and the PostGIS Dev Wiki .

Pre-built packages for various operating systems are listed in PostGIS Pre-built Packages .

If you are a Windows user, you can get stable builds via Stackbuilder or the PostGIS Windows download site. We also have very bleeding-edge Windows experimental builds that are built usually once or twice a week, or whenever anything exciting happens. You can use these to experiment with in-progress releases of PostGIS.

The PostGIS module is an extension to the PostgreSQL backend server. As such, PostGIS 3.0.4 requires full access to the PostgreSQL server headers in order to compile. It can be built against PostgreSQL versions 9.5 or higher. Earlier versions of PostgreSQL are not supported.

Refer to the PostgreSQL installation guides if you haven't already installed PostgreSQL. http://www.postgresql.org .

[Note]

For GEOS functionality, when you install PostgreSQL you may need to explicitly link PostgreSQL against the standard C++ library:

LDFLAGS=-lstdc++ ./configure [YOUR OPTIONS HERE]

This is a workaround for bogus interactions between C++ exceptions and older development tools. If you experience weird problems (backend unexpectedly closed or similar things), try this trick. This will require recompiling your PostgreSQL from scratch, of course.

The following steps outline the configuration and compilation of the PostGIS source. They are written for Linux users and will not work on Windows or Mac.

2.5.1. Configuration

As with most Linux installations, the first step is to generate the Makefile that will be used to build the source code. This is done by running the shell script

./configure

With no additional parameters, this command will attempt to automatically locate the required components and libraries needed to build the PostGIS source code on your system. Although this is the most common usage of ./configure , the script accepts several parameters for those who have the required libraries and programs in non-standard locations.

The following list shows only the most commonly used parameters. For a complete list, use the --help or --help=short parameters.

--with-library-minor-version

Starting with PostGIS 3.0, the library files generated by default will no longer have the minor version as part of the file name. This means all PostGIS 3 libs will end in postgis-3 . This was done to make pg_upgrade easier, with the downside that you can only install one version of the PostGIS 3 series in your server. To get the old behavior of a file name that includes the minor version (e.g. postgis-3.0), add this switch to your configure statement.

--prefix=PREFIX

This is the location the PostGIS loader executables and shared libs will be installed. By default, this location is the same as the detected PostgreSQL installation.

[Caution]

This parameter is currently broken, as the package will only install into the PostgreSQL installation directory. Visit http://trac.osgeo.org/postgis/ticket/635 to track this bug.

--with-pgconfig=FILE

PostgreSQL provides a utility called pg_config to enable extensions like PostGIS to locate the PostgreSQL installation directory. Use this parameter ( --with-pgconfig=/path/to/pg_config ) to manually specify a particular PostgreSQL installation that PostGIS will build against.

--with-gdalconfig=FILE

GDAL, a required library, provides functionality needed for raster support and supplies a utility called gdal-config to enable software installations to locate the GDAL installation directory. Use this parameter ( --with-gdalconfig=/path/to/gdal-config ) to manually specify a particular GDAL installation that PostGIS will build against.

--with-geosconfig=FILE

GEOS, a required geometry library, provides a utility called geos-config to enable software installations to locate the GEOS installation directory. Use this parameter ( --with-geosconfig=/path/to/geos-config ) to manually specify a particular GEOS installation that PostGIS will build against.

--with-xml2config=FILE

LibXML is the library required for the GeomFromKML/GML functions. It is normally found if you have libxml installed, but if it is not, or you want a specific version used, you'll need to point PostGIS at a specific xml2-config config file to enable software installations to locate the LibXML installation directory. Use this parameter ( --with-xml2config=/path/to/xml2-config ) to manually specify a particular LibXML installation that PostGIS will build against.

--with-projdir=DIR

Proj4 is a reprojection library required by PostGIS. Use this parameter ( --with-projdir=/path/to/projdir ) to manually specify a particular Proj4 installation directory that PostGIS will build against.

--with-libiconv=DIR

Directory where iconv is installed.

--with-jsondir=DIR

JSON-C is an MIT-licensed JSON library required by PostGIS ST_GeomFromJSON support. Use this parameter ( --with-jsondir=/path/to/jsondir ) to manually specify a particular JSON-C installation directory that PostGIS will build against.

--with-pcredir=DIR

PCRE is a BSD-licensed Perl Compatible Regular Expression library required by the address_standardizer extension. Use this parameter ( --with-pcredir=/path/to/pcredir ) to manually specify a particular PCRE installation directory that PostGIS will build against.

--with-gui

Compile the data import GUI (requires GTK+2.0). This will create the shp2pgsql-gui graphical interface to shp2pgsql.

--without-raster

Compile without raster support.

--without-topology

Disable topology support. There is no corresponding library, as all logic needed for topology is in the postgis-3.0.4 library.

--with-gettext=no

By default PostGIS will try to detect gettext support and compile with it; however, if you run into incompatibility issues that cause breakage of the loader, you can disable it entirely with this switch. Refer to ticket http://trac.osgeo.org/postgis/ticket/748 for an example issue solved by configuring with this. Note that you aren't missing much by turning this off; it is used for international help/label support for the GUI loader, which is not yet documented and still experimental.

--with-sfcgal=PATH

By default PostGIS will not be built with SFCGAL support unless this switch is given. PATH is an optional argument that allows you to specify an alternate path to sfcgal-config.

--without-wagyu

When building with MVT support, PostGIS will use Wagyu to clip and validate MVT polygons. Wagyu is the fastest alternative and guarantees producing correct values for this specific case, but it requires a C++11 compiler. With this optional argument you can disable using this library; GEOS will be used instead.
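
As an illustration only (every path below is a hypothetical example, not a required location), several of these switches can be combined into a single configure invocation:

./configure \
  --with-pgconfig=/usr/local/pgsql/bin/pg_config \
  --with-geosconfig=/usr/local/geos/bin/geos-config \
  --with-projdir=/usr/local/proj \
  --with-sfcgal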

[Note]

If you obtained PostGIS from the code repository, the first step is really to run the script

./autogen.sh

This script will generate the configure script that in turn is used to customize the installation of PostGIS.

If you instead obtained PostGIS as a tarball, running ./autogen.sh is not necessary as configure has already been generated.

2.5.2. Building

Once the Makefile has been generated, building PostGIS is as simple as running

make

The last line of the output should be " PostGIS was built successfully. Ready to install. "

As of PostGIS v1.4.0, all the functions have comments generated from the documentation. If you wish to install these comments into your spatial databases later, run the command below, which requires docbook. The postgis_comments.sql and the other package comment files (raster_comments.sql, topology_comments.sql) are also packaged in the tar.gz distribution in the doc folder, so there is no need to run make comments if installing from the tarball. Comments are also installed as part of the CREATE EXTENSION install.

make comments

Introduced in PostGIS 2.0, this generates HTML cheat sheets suitable for quick reference or for student handouts. It requires xsltproc to build and will generate 4 files in the doc folder: topology_cheatsheet.html , tiger_geocoder_cheatsheet.html , raster_cheatsheet.html , postgis_cheatsheet.html .

You can download some pre-built ones, available in HTML and PDF, from PostGIS / PostgreSQL Study Guides .

make cheatsheets

2.5.3. Building PostGIS Extensions and Deploying them

The PostGIS extensions are built and installed automatically if you are using PostgreSQL 9.1+.

If you are building from the source repository, you need to build the function descriptions first. These get built if you have docbook installed. You can also build them manually with the statement:

make comments

Building the comments is not necessary if you are building from a release tarball, since these come pre-built in the tarball.

The extensions should automatically build as part of the make install process. If needed, you can build from the extensions folders, or copy the files if you need them on a different server.

cd extensions
cd postgis
make clean
make
export PGUSER=postgres #overwrite psql variables
make check #to test before install
make install
# to test extensions
make check RUNTESTFLAGS=--extension
[Note]

make check uses psql to run tests and as such can use psql environment variables. Common ones useful to override are PGUSER , PGPORT , and PGHOST . Refer to the psql environment variables documentation.
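
For instance (the values are placeholders for your own environment):

export PGUSER=postgres
export PGHOST=localhost
export PGPORT=5432
make check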

The extension files will always be the same for the same version of PostGIS and PostgreSQL regardless of OS, so it is fine to copy over the extension files from one OS to another as long as you have the PostGIS binaries already installed on your servers.

If you want to install the extensions manually on a separate server, different from your development one, you need to copy the following files from the extensions folder into the PostgreSQL share/extension folder of your PostgreSQL install, as well as the needed binaries for regular PostGIS if you don't already have them on the server.

  • The control files, which denote information such as the version of the extension to install if not specified: postgis.control, postgis_topology.control .

  • All the files in the /sql folder of each extension ( extensions/postgis/sql/*.sql , extensions/postgis_topology/sql/*.sql ). Note that these need to be copied to the root of the PostgreSQL share/extension folder, as sketched below.
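
A rough sketch of the copy, assuming a hypothetical share directory as reported by pg_config --sharedir (adjust the path to your own install):

# run from the extensions folder of the build tree
cp postgis/postgis.control postgis_topology/postgis_topology.control \
   /usr/share/postgresql/12/extension/
cp postgis/sql/*.sql postgis_topology/sql/*.sql \
   /usr/share/postgresql/12/extension/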

Once you do that, you should see postgis , postgis_topology as available extensions in PgAdmin -> extensions.

If you are using psql, you can verify that the extensions are installed by running this query:

SELECT name, default_version,installed_version
FROM pg_available_extensions WHERE name LIKE 'postgis%' or name LIKE 'address%';

             name             | default_version | installed_version
------------------------------+-----------------+-------------------
 address_standardizer         | 3.0.4         | 3.0.4
 address_standardizer_data_us | 3.0.4         | 3.0.4
 postgis                      | 3.0.4         | 3.0.4
 postgis_raster               | 3.0.4         | 3.0.4
 postgis_sfcgal               | 3.0.4         |
 postgis_tiger_geocoder       | 3.0.4         | 3.0.4
 postgis_topology             | 3.0.4         |
(6 rows)

If you have the extension installed in the database you are querying, you'll see it mentioned in the installed_version column. If you get no records back, it means you don't have any postgis extensions installed on the server at all. PgAdmin III 1.14+ will also provide this information in the extensions section of the database browser tree and will even allow upgrade or uninstall by right-clicking.

If you have the extensions available, you can install the postgis extension in your database of choice either by using the pgAdmin extension interface or by running these SQL commands:

CREATE EXTENSION postgis;
CREATE EXTENSION postgis_raster;
CREATE EXTENSION postgis_sfcgal;
CREATE EXTENSION fuzzystrmatch; --needed for postgis_tiger_geocoder
--optional used by postgis_tiger_geocoder, or can be used standalone
CREATE EXTENSION address_standardizer;
CREATE EXTENSION address_standardizer_data_us;
CREATE EXTENSION postgis_tiger_geocoder;
CREATE EXTENSION postgis_topology;

In psql you can use \dx to see which versions you have installed and also in which schema they are installed:

\connect mygisdb
\x
\dx postgis*
List of installed extensions
-[ RECORD 1 ]-------------------------------------------------
Name        | postgis
Version     | 3.0.4
Schema      | public
Description | PostGIS geometry, geography, and raster spat..
-[ RECORD 2 ]-------------------------------------------------
Name        | postgis_raster
Version     | 3.0.0dev
Schema      | public
Description | PostGIS raster types and functions
-[ RECORD 3 ]-------------------------------------------------
Name        | postgis_tiger_geocoder
Version     | 3.0.4
Schema      | tiger
Description | PostGIS tiger geocoder and reverse geocoder
-[ RECORD 4 ]-------------------------------------------------
Name        | postgis_topology
Version     | 3.0.4
Schema      | topology
Description | PostGIS topology spatial types and functions
[Warning]

Extension tables spatial_ref_sys , layer , topology cannot be explicitly backed up. They can only be backed up when the respective postgis or postgis_topology extension is backed up, which only seems to happen when you back up the whole database. As of PostGIS 2.0.1, only srid records not packaged with PostGIS are backed up when the database is backed up, so don't go around changing srids we package and expect your changes to be there. Put in a ticket if you find an issue. The structures of extension tables are never backed up, since they are created with CREATE EXTENSION and assumed to be the same for a given version of an extension. These behaviors are built into the current PostgreSQL extension model, so there is nothing we can do about it.

If you installed 3.0.4 without using our wonderful extension system, you can change it to be extension based by running the below commands to package the functions in their respective extensions.

CREATE EXTENSION postgis FROM unpackaged;
CREATE EXTENSION postgis_raster FROM unpackaged;
CREATE EXTENSION postgis_topology FROM unpackaged;
CREATE EXTENSION postgis_tiger_geocoder FROM unpackaged;

2.5.4. Testing

If you wish to test the PostGIS build, run

make check

The above command will run through various checks and regression tests using the generated library against an actual PostgreSQL database.

[Note]

If you configured PostGIS using non-standard PostgreSQL, GEOS, or Proj4 locations, you may need to add their library locations to the LD_LIBRARY_PATH environment variable.
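
For example (the library paths are hypothetical; substitute the locations you configured against):

export LD_LIBRARY_PATH=/usr/local/pgsql/lib:/usr/local/geos/lib:/usr/local/proj/lib:$LD_LIBRARY_PATH
make check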

[Caution]

Currently, make check relies on the PATH and PGPORT environment variables when performing the checks - it does not use the PostgreSQL version that may have been specified using the configuration parameter --with-pgconfig . So make sure to modify your PATH to match the detected PostgreSQL installation during configuration, or be prepared to deal with the impending headaches.

If successful, the output of the test should be similar to the following:


     CUnit - A unit testing framework for C - Version 2.1-3
     http://cunit.sourceforge.net/


Suite: algorithm
  Test: test_lw_segment_side ...passed
  Test: test_lw_segment_intersects ...passed
  Test: test_lwline_crossing_short_lines ...passed
  Test: test_lwline_crossing_long_lines ...passed
  Test: test_lwline_crossing_bugs ...passed
  Test: test_lwpoint_set_ordinate ...passed
  Test: test_lwpoint_get_ordinate ...passed
  Test: test_point_interpolate ...passed
  Test: test_lwline_interpolate_points ...passed
  Test: test_lwline_interpolate_point_3d ...passed
  Test: test_lwline_clip ...passed
  Test: test_lwpoly_clip ...passed
  Test: test_lwtriangle_clip ...passed
  Test: test_lwline_clip_big ...passed
  Test: test_lwmline_clip ...passed
  Test: test_geohash_point ...passed
  Test: test_geohash_precision ...passed
  Test: test_geohash ...passed
  Test: test_geohash_point_as_int ...passed
  Test: test_isclosed ...passed
  Test: test_lwgeom_simplify ...passed
  Test: test_lw_arc_center ...passed
  Test: test_point_density ...passed
  Test: test_kmeans ...passed
  Test: test_median_handles_3d_correctly ...passed
  Test: test_median_robustness ...passed
  Test: test_lwpoly_construct_circle ...passed
  Test: test_trim_bits ...passed
  Test: test_lwgeom_remove_repeated_points ...passed
Suite: buildarea
  Test: buildarea1 ...passed
  Test: buildarea2 ...passed
  Test: buildarea3 ...passed
  Test: buildarea4 ...passed
  Test: buildarea4b ...passed
  Test: buildarea5 ...passed
  Test: buildarea6 ...passed
  Test: buildarea7 ...passed
Suite: geometry_clean
  Test: test_lwgeom_make_valid ...passed
Suite: clip_by_rectangle
  Test: test_lwgeom_clip_by_rect ...DEBUG1: lwgeom_clip_by_rect: GEOS Error: IllegalArgumentException: Invalid number of points in LinearRing found 3 - must be 0 or >= 4
passed
Suite: force_sfs
  Test: test_sfs_11 ...passed
  Test: test_sfs_12 ...passed
  Test: test_sqlmm ...passed
Suite: geodetic
  Test: test_sphere_direction ...passed
  Test: test_sphere_project ...passed
  Test: test_lwgeom_area_sphere ...passed
  Test: test_gbox_from_spherical_coordinates ...passed
  Test: test_gserialized_get_gbox_geocentric ...passed
  Test: test_clairaut ...passed
  Test: test_edge_intersection ...passed
  Test: test_edge_intersects ...passed
  Test: test_edge_distance_to_point ...passed
  Test: test_edge_distance_to_edge ...passed
  Test: test_lwgeom_distance_sphere ...passed
  Test: test_lwgeom_check_geodetic ...passed
  Test: test_gserialized_from_lwgeom ...passed
  Test: test_spheroid_distance ...passed
  Test: test_spheroid_area ...passed
  Test: test_lwpoly_covers_point2d ...passed
  Test: test_gbox_utils ...passed
  Test: test_vector_angle ...passed
  Test: test_vector_rotate ...passed
  Test: test_lwgeom_segmentize_sphere ...passed
  Test: test_ptarray_contains_point_sphere ...passed
  Test: test_ptarray_contains_point_sphere_iowa ...passed
  Test: test_gbox_to_string_truncated ...passed
Suite: geos
  Test: test_geos_noop ...passed
  Test: test_geos_subdivide ...passed
  Test: test_geos_linemerge ...passed
  Test: test_geos_offsetcurve ...passed
  Test: test_geos_offsetcurve_crash ...passed
  Test: test_geos_makevalid ...passed
Suite: clustering
  Test: basic_test ...passed
  Test: nonsequential_test ...passed
  Test: basic_distance_test ...passed
  Test: single_input_test ...passed
  Test: empty_inputs_test ...passed
  Test: multipoint_test ...passed
  Test: dbscan_test ...passed
  Test: dbscan_test_3612a ...passed
  Test: dbscan_test_3612b ...passed
  Test: dbscan_test_3612c ...passed
Suite: clustering_unionfind
  Test: test_unionfind_create ...passed
  Test: test_unionfind_union ...passed
  Test: test_unionfind_ordered_by_cluster ...passed
  Test: test_unionfind_path_compression ...passed
  Test: test_unionfind_collapse_cluster_ids ...passed
Suite: homogenize
  Test: test_coll_point ...passed
  Test: test_coll_line ...passed
  Test: test_coll_poly ...passed
  Test: test_coll_coll ...passed
  Test: test_geom ...passed
  Test: test_coll_curve ...passed
Suite: encoded_polyline_input
  Test: in_encoded_polyline_test_geoms ...passed
  Test: in_encoded_polyline_test_precision ...passed
Suite: geojson_input
  Test: in_geojson_test_srid ...passed
  Test: in_geojson_test_bbox ...passed
  Test: in_geojson_test_geoms ...passed
Suite: iterator
  Test: test_point_count ...passed
  Test: test_ordering ...passed
  Test: test_modification ...passed
  Test: test_mixed_rw_access ...passed
  Test: test_cannot_modify_read_only ...passed
  Test: test_no_memory_leaked_when_iterator_is_partially_used ...passed
Suite: twkb_input
  Test: test_twkb_in_point ...passed
  Test: test_twkb_in_linestring ...passed
  Test: test_twkb_in_polygon ...passed
  Test: test_twkb_in_multipoint ...passed
  Test: test_twkb_in_multilinestring ...passed
  Test: test_twkb_in_multipolygon ...passed
  Test: test_twkb_in_collection ...passed
  Test: test_twkb_in_precision ...passed
Suite: serialization/deserialization
  Test: test_typmod_macros ...passed
  Test: test_flags_macros ...passed
  Test: test_serialized_srid ...NOTICE: SRID value -3005 converted to the officially unknown SRID value 0
passed
  Test: test_gserialized_from_lwgeom_size ...passed
  Test: test_gbox_serialized_size ...passed
  Test: test_lwgeom_from_gserialized ...passed
  Test: test_lwgeom_count_vertices ...passed
  Test: test_on_gser_lwgeom_count_vertices ...passed
  Test: test_geometry_type_from_string ...passed
  Test: test_lwcollection_extract ...passed
  Test: test_lwgeom_free ...passed
  Test: test_lwgeom_swap_ordinates ...passed
  Test: test_f2d ...passed
  Test: test_lwgeom_clone ...passed
  Test: test_lwgeom_force_clockwise ...passed
  Test: test_lwgeom_calculate_gbox ...passed
  Test: test_lwgeom_is_empty ...passed
  Test: test_lwgeom_same ...passed
  Test: test_lwline_from_lwmpoint ...passed
  Test: test_lwgeom_as_curve ...passed
  Test: test_lwgeom_scale ...passed
  Test: test_gserialized_is_empty ...passed
  Test: test_gserialized_peek_gbox_p_no_box_when_empty ...passed
  Test: test_gserialized_peek_gbox_p_gets_correct_box ...passed
  Test: test_gserialized_peek_gbox_p_fails_for_unsupported_cases ...passed
  Test: test_gbox_same_2d ...passed
  Test: test_signum_macro ...passed
Suite: lwstroke
  Test: test_lwcurve_linearize ...passed
  Test: test_unstroke ...passed
Suite: measures
  Test: test_mindistance2d_tolerance ...passed
  Test: test_mindistance3d_tolerance ...NOTICE: One or both of the geometries is missing z-value. The unknown z-value will be regarded as "any value"
NOTICE: One or both of the geometries is missing z-value. The unknown z-value will be regarded as "any value"
passed
  Test: test_rect_tree_contains_point ...passed
  Test: test_rect_tree_intersects_tree ...passed
  Test: test_lwgeom_segmentize2d ...NOTICE: ptarray.c:448 - ptarray_segmentize2d: Too many segments required (1.000000e+101)
NOTICE: liblwgeom code interrupted
NOTICE: liblwgeom code interrupted
NOTICE: liblwgeom code interrupted
NOTICE: liblwgeom code interrupted
passed
  Test: test_lwgeom_locate_along ...passed
  Test: test_lw_dist2d_pt_arc ...passed
  Test: test_lw_dist2d_seg_arc ...passed
  Test: test_lw_dist2d_arc_arc ...passed
  Test: test_lw_arc_length ...passed
  Test: test_lw_dist2d_pt_ptarrayarc ...passed
  Test: test_lw_dist2d_ptarray_ptarrayarc ...passed
  Test: test_lwgeom_tcpa ...passed
  Test: test_lwgeom_is_trajectory ...NOTICE: Geometry is not a LINESTRING
NOTICE: Line does not have M dimension
NOTICE: Measure of vertex 1 (1) not bigger than measure of vertex 0 (1)
NOTICE: Measure of vertex 1 (0) not bigger than measure of vertex 0 (1)
NOTICE: Measure of vertex 2 (2) not bigger than measure of vertex 1 (3)
passed
  Test: test_rect_tree_distance_tree ...passed
Suite: effectivearea
  Test: do_test_lwgeom_effectivearea_lines ...passed
  Test: do_test_lwgeom_effectivearea_polys ...passed
Suite: chaikin
  Test: do_test_chaikin_lines ...passed
  Test: do_test_chaikin_polygons ...passed
Suite: filterm
  Test: do_test_filterm_single_geometries ...passed
  Test: do_test_filterm_collections ...passed
Suite: minimum_bounding_circle
  Test: basic_test ...passed
  Test: test_empty ...passed
Suite: miscellaneous
  Test: test_misc_force_2d ...passed
  Test: test_misc_simplify ...passed
  Test: test_misc_count_vertices ...passed
  Test: test_misc_area ...passed
  Test: test_misc_wkb ...passed
  Test: test_grid ...passed
  Test: test_grid_in_place ...passed
  Test: test_clone ...passed
  Test: test_lwmpoint_from_lwgeom ...passed
Suite: noding
  Test: test_lwgeom_node ...passed
Suite: encoded_polyline_output
  Test: out_encoded_polyline_test_geoms ...passed
  Test: out_encoded_polyline_test_srid ...passed
  Test: out_encoded_polyline_test_precision ...passed
Suite: geojson_output
  Test: out_geojson_test_precision ...passed
  Test: out_geojson_test_dims ...passed
  Test: out_geojson_test_srid ...passed
  Test: out_geojson_test_bbox ...passed
  Test: out_geojson_test_geoms ...passed
Suite: gml_output
  Test: out_gml_test_precision ...passed
  Test: out_gml_test_srid ...passed
  Test: out_gml_test_dims ...passed
  Test: out_gml_test_geodetic ...passed
  Test: out_gml_test_geoms ...passed
  Test: out_gml_test_geoms_prefix ...passed
  Test: out_gml_test_geoms_nodims ...passed
  Test: out_gml2_extent ...passed
  Test: out_gml3_extent ...passed
Suite: kml_output
  Test: out_kml_test_precision ...passed
  Test: out_kml_test_dims ...passed
  Test: out_kml_test_geoms ...passed
  Test: out_kml_test_prefix ...passed
Suite: svg_output
  Test: out_svg_test_precision ...passed
  Test: out_svg_test_dims ...passed
  Test: out_svg_test_relative ...passed
  Test: out_svg_test_geoms ...passed
  Test: out_svg_test_srid ...passed
Suite: x3d_output
  Test: out_x3d3_test_precision ...passed
  Test: out_x3d3_test_geoms ...passed
  Test: out_x3d3_test_option ...passed
Suite: ptarray
  Test: test_ptarray_append_point ...passed
  Test: test_ptarray_append_ptarray ...passed
  Test: test_ptarray_locate_point ...passed
  Test: test_ptarray_isccw ...passed
  Test: test_ptarray_signed_area ...passed
  Test: test_ptarray_insert_point ...passed
  Test: test_ptarray_contains_point ...passed
  Test: test_ptarrayarc_contains_point ...passed
  Test: test_ptarray_scale ...passed
Suite: printing
  Test: test_lwprint_default_format ...passed
  Test: test_lwprint_format_orders ...passed
  Test: test_lwprint_optional_format ...passed
  Test: test_lwprint_oddball_formats ...passed
  Test: test_lwprint_bad_formats ...passed
Suite: sfcgal
  Test: test_sfcgal_noop ...passed
Suite: split
  Test: test_lwline_split_by_point_to ...passed
  Test: test_lwgeom_split ...passed
Suite: stringbuffer
  Test: test_stringbuffer_append ...passed
  Test: test_stringbuffer_aprintf ...passed
Suite: surface
  Test: triangle_parse ...passed
  Test: tin_parse ...passed
  Test: polyhedralsurface_parse ...passed
  Test: surface_dimension ...passed
Suite: spatial_trees
  Test: test_tree_circ_create ...passed
  Test: test_tree_circ_pip ...passed
  Test: test_tree_circ_pip2 ...passed
  Test: test_tree_circ_distance ...passed
  Test: test_tree_circ_distance_threshold ...passed
Suite: triangulate
  Test: test_lwgeom_delaunay_triangulation ...passed
  Test: test_lwgeom_voronoi_diagram ...passed
  Test: test_lwgeom_voronoi_diagram_expected_empty ...passed
  Test: test_lwgeom_voronoi_diagram_custom_envelope ...passed
Suite: twkb_output
  Test: test_twkb_out_point ...passed
  Test: test_twkb_out_linestring ...passed
  Test: test_twkb_out_polygon ...passed
  Test: test_twkb_out_multipoint ...passed
  Test: test_twkb_out_multilinestring ...passed
  Test: test_twkb_out_multipolygon ...passed
  Test: test_twkb_out_collection ...passed
  Test: test_twkb_out_idlist ...passed
Suite: varint
  Test: test_zigzag ...passed
  Test: test_varint ...passed
  Test: test_varint_roundtrip ...passed
Suite: wkb_input
  Test: test_wkb_in_point ...passed
  Test: test_wkb_in_linestring ...passed
  Test: test_wkb_in_polygon ...passed
  Test: test_wkb_in_multipoint ...passed
  Test: test_wkb_in_multilinestring ...passed
  Test: test_wkb_in_multipolygon ...passed
  Test: test_wkb_in_collection ...passed
  Test: test_wkb_in_circularstring ...passed
  Test: test_wkb_in_compoundcurve ...passed
  Test: test_wkb_in_curvpolygon ...passed
  Test: test_wkb_in_multicurve ...passed
  Test: test_wkb_in_multisurface ...passed
  Test: test_wkb_in_malformed ...passed
Suite: wkb_output
  Test: test_wkb_out_point ...passed
  Test: test_wkb_out_linestring ...passed
  Test: test_wkb_out_polygon ...passed
  Test: test_wkb_out_multipoint ...passed
  Test: test_wkb_out_multilinestring ...passed
  Test: test_wkb_out_multipolygon ...passed
  Test: test_wkb_out_collection ...passed
  Test: test_wkb_out_circularstring ...passed
  Test: test_wkb_out_compoundcurve ...passed
  Test: test_wkb_out_curvpolygon ...passed
  Test: test_wkb_out_multicurve ...passed
  Test: test_wkb_out_multisurface ...passed
  Test: test_wkb_out_polyhedralsurface ...passed
Suite: wkt_input
  Test: test_wkt_in_point ...passed
  Test: test_wkt_in_linestring ...passed
  Test: test_wkt_in_polygon ...passed
  Test: test_wkt_in_multipoint ...passed
  Test: test_wkt_in_multilinestring ...passed
  Test: test_wkt_in_multipolygon ...passed
  Test: test_wkt_in_collection ...passed
  Test: test_wkt_in_circularstring ...passed
  Test: test_wkt_in_compoundcurve ...passed
  Test: test_wkt_in_curvpolygon ...passed
  Test: test_wkt_in_multicurve ...passed
  Test: test_wkt_in_multisurface ...passed
  Test: test_wkt_in_tin ...passed
  Test: test_wkt_in_polyhedralsurface ...passed
  Test: test_wkt_in_errlocation ...passed
  Test: test_wkt_double ...passed
Suite: wkt_output
  Test: test_wkt_out_point ...passed
  Test: test_wkt_out_linestring ...passed
  Test: test_wkt_out_polygon ...passed
  Test: test_wkt_out_multipoint ...passed
  Test: test_wkt_out_multilinestring ...passed
  Test: test_wkt_out_multipolygon ...passed
  Test: test_wkt_out_collection ...passed
  Test: test_wkt_out_circularstring ...passed
  Test: test_wkt_out_compoundcurve ...passed
  Test: test_wkt_out_curvpolygon ...passed
  Test: test_wkt_out_multicurve ...passed
  Test: test_wkt_out_multisurface ...passed
Suite: wrapx
  Test: test_lwgeom_wrapx ...passed

Run Summary:    Type  Total    Ran Passed Failed Inactive
              suites     44     44    n/a      0        0
               tests    300    300    300      0        0
             asserts   4215   4215   4215      0      n/a
Elapsed time =    0.229 seconds

PostgreSQL 12devel on x86_64-w64-mingw32, compiled by gcc.exe (x86_64-posix-seh-rev0, Built by MinGW-W64 project) 8.1.0, 64-bit
  Postgis 3.0.0dev - r17081 - 2018-11-28 18:50:02
  scripts 3.0.0dev r17081
  GEOS: 3.7.0-CAPI-1.11.0 673b9939
  PROJ: Rel. 5.2.0, September 15th, 2018

Running tests

 ../loader/Point .............. ok
 ../loader/PointM .............. ok
 ../loader/PointZ .............. ok
 ../loader/MultiPoint .............. ok
 ../loader/MultiPointM .............. ok
 ../loader/MultiPointZ .............. ok
 ../loader/Arc .............. ok
 ../loader/ArcM .............. ok
 ../loader/ArcZ .............. ok
 ../loader/Polygon .............. ok
 ../loader/PolygonM .............. ok
 ../loader/PolygonZ .............. ok
 ../loader/TSTPolygon ......... ok
 ../loader/TSIPolygon ......... ok
 ../loader/TSTIPolygon ......... ok
 ../loader/PointWithSchema ..... ok
 ../loader/NoTransPoint ......... ok
 ../loader/NotReallyMultiPoint ......... ok
 ../loader/MultiToSinglePoint ......... ok
 ../loader/ReprojectPts ........ ok
 ../loader/ReprojectPtsGeog ........ ok
 ../loader/Latin1 .... ok
 ../loader/Latin1-implicit .... ok
 ../loader/mfile .... ok
 ../dumper/literalsrid ....... ok
 ../dumper/realtable ....... ok
 affine .. ok
 bestsrid .. ok
 binary .. ok
 boundary .. ok
 chaikin .. ok
 filterm .. ok
 cluster .. ok
 concave_hull .. ok
 concave_hull_hard .. ok
 ctors .. ok
 curvetoline .. ok
 dump .. ok
 dumppoints .. ok
 empty .. ok
 estimatedextent .. ok
 forcecurve .. ok
 geography .. ok
 geometric_median .. ok
 in_geohash .. ok
 in_gml .. ok
 in_kml .. ok
 in_encodedpolyline .. ok
 iscollection .. ok
 legacy .. ok
 long_xact .. ok
 lwgeom_regress .. ok
 measures .. ok
 minimum_bounding_circle .. ok
 normalize .. ok
 operators .. ok
 orientation .. ok
 out_geometry .. ok
 out_geography .. ok
 polygonize .. ok
 polyhedralsurface .. ok
 postgis_type_name .. ok
 quantize_coordinates .. ok
 regress .. ok
 regress_bdpoly .. ok
 regress_gist_index_nd .. ok
 regress_index .. ok
 regress_index_nulls .. ok
 regress_management .. ok
 regress_selectivity .. ok
 regress_lrs .. ok
 regress_ogc .. ok
 regress_ogc_cover .. ok
 regress_ogc_prep .. ok
 regress_proj .. ok
 relate .. ok
 remove_repeated_points .. ok
 removepoint .. ok
 reverse .. ok
 setpoint .. ok
 simplify .. ok
 simplifyvw .. ok
 size .. ok
 snaptogrid .. ok
 split .. ok
 sql-mm-serialize .. ok
 sql-mm-circularstring .. ok
 sql-mm-compoundcurve .. ok
 sql-mm-curvepoly .. ok
 sql-mm-general .. ok
 sql-mm-multicurve .. ok
 sql-mm-multisurface .. ok
 swapordinates .. ok
 summary .. ok
 temporal .. ok
 tickets .. ok
 twkb .. ok
 typmod .. ok
 wkb .. ok
 wkt .. ok
 wmsservers .. ok
 knn_recheck .. ok
 temporal_knn .. ok
 hausdorff .. ok
 regress_buffer_params .. ok
 frechet .. ok
 offsetcurve .. ok
 relatematch .. ok
 isvaliddetail .. ok
 sharedpaths .. ok
 snap .. ok
 node .. ok
 unaryunion .. ok
 clean .. ok
 relate_bnr .. ok
 delaunaytriangles .. ok
 clipbybox2d .. ok
 subdivide .. ok
 voronoi .. ok
 minimum_clearance .. ok
 oriented_envelope .. ok
 in_geojson .. ok
 regress_brin_index .. ok
 regress_brin_index_3d .. ok
 regress_brin_index_geography .. ok
 regress_spgist_index_2d .. ok
 regress_spgist_index_3d .. ok
 regress_spgist_index_nd .. ok
 mvt .. ok
 geobuf .. ok
 mvt_jsonb .. ok
 uninstall .. ok (4643)

Run tests: 134
Failed: 0


-- if you built with SFCGAL

PostgreSQL 12devel on x86_64-w64-mingw32, compiled by gcc.exe (x86_64-posix-seh-rev0, Built by MinGW-W64 project) 8.1.0, 64-bit
  Postgis 3.0.0dev - r17081 - 2018-11-28 18:50:02
  scripts 3.0.0dev r17081
  GEOS: 3.7.0-CAPI-1.11.0 673b9939
  PROJ: Rel. 5.2.0, September 15th, 2018
  SFCGAL: 1.3.2

Running tests

 regress_sfcgal .. ok
 empty .. ok
 geography .. ok
 legacy .. ok
 measures .. ok
 regress_ogc_prep .. ok
 regress_ogc .. ok
 regress .. ok
 tickets .. ok
 concave_hull .. ok
 wmsservers .. ok
 approximatemedialaxis .. ok
 uninstall .. ok (4643)

Run tests: 13
Failed: 0

-- if you built with raster support



     CUnit - A unit testing framework for C - Version 2.1-2
     http://cunit.sourceforge.net/


Suite: pixtype
  Test: test_pixtype_size ...passed
  Test: test_pixtype_alignment ...passed
  Test: test_pixtype_name ...passed
  Test: test_pixtype_index_from_name ...passed
  Test: test_pixtype_get_min_value ...passed
  Test: test_pixtype_compare_clamped_values ...passed
Suite: raster_basics
  Test: test_raster_new ...passed
  Test: test_raster_empty ...passed
  Test: test_raster_metadata ...passed
  Test: test_raster_clone ...passed
  Test: test_raster_from_band ...passed
  Test: test_raster_replace_band ...passed
Suite: band_basics
  Test: test_band_metadata ...passed
  Test: test_band_pixtype_1BB ...passed
  Test: test_band_pixtype_2BUI ...passed
  Test: test_band_pixtype_4BUI ...passed
  Test: test_band_pixtype_8BUI ...passed
  Test: test_band_pixtype_8BSI ...passed
  Test: test_band_pixtype_16BUI ...passed
  Test: test_band_pixtype_16BSI ...passed
  Test: test_band_pixtype_32BUI ...passed
  Test: test_band_pixtype_32BSI ...passed
  Test: test_band_pixtype_32BF ...passed
  Test: test_band_pixtype_64BF ...passed
  Test: test_band_get_pixel_line ...WARNING: Limiting returning number values to 1
WARNING: Attempting to get pixel values with out of range raster coordinates: (5, 5)
passed
  Test: test_band_new_offline_from_path ...passed
Suite: raster_wkb
  Test: test_raster_wkb ...SRID value -1 converted to the officially unknown SRID value 0
SRID value -1 converted to the officially unknown SRID value 0
SRID value -1 converted to the officially unknown SRID value 0
SRID value -1 converted to the officially unknown SRID value 0
SRID value -1 converted to the officially unknown SRID value 0
SRID value -1 converted to the officially unknown SRID value 0
passed
Suite: gdal
  Test: test_gdal_configured ...passed
  Test: test_gdal_drivers ...passed
  Test: test_gdal_rasterize ...passed
  Test: test_gdal_polygonize ...passed
  Test: test_raster_to_gdal ...Warning 6: PNG driver doesn't support data type Float64. Only eight bit (Byte) and sixteen bit (UInt16) bands supported. Defaulting to Byte

passed
  Test: test_gdal_to_raster ...passed
  Test: test_gdal_warp ...passed
Suite: raster_geometry
  Test: test_raster_envelope ...passed
  Test: test_raster_envelope_geom ...passed
  Test: test_raster_convex_hull ...passed
  Test: test_raster_surface ...INFO: Ring Self-intersection at or near point 2 -2
INFO: Ring Self-intersection at or near point 3 -3
passed
  Test: test_raster_perimeter ...passed
  Test: test_raster_pixel_as_polygon ...passed
Suite: raster_misc
  Test: test_raster_cell_to_geopoint ...passed
  Test: test_raster_geopoint_to_cell ...passed
  Test: test_raster_from_two_rasters ...passed
  Test: test_raster_compute_skewed_raster ...passed
Suite: band_stats
  Test: test_band_stats ...passed
  Test: test_band_value_count ...passed
Suite: band_misc
  Test: test_band_get_nearest_pixel ...passed
  Test: test_band_get_pixel_of_value ...passed
  Test: test_pixel_set_to_array ...passed
Suite: mapalgebra
  Test: test_raster_iterator ...passed
  Test: test_band_reclass ...passed
  Test: test_raster_colormap ...passed
Suite: spatial_relationship
  Test: test_raster_geos_overlaps ...passed
  Test: test_raster_geos_touches ...passed
  Test: test_raster_geos_contains ...passed
  Test: test_raster_geos_contains_properly ...passed
  Test: test_raster_geos_covers ...passed
  Test: test_raster_geos_covered_by ...passed
  Test: test_raster_within_distance ...passed
  Test: test_raster_fully_within_distance ...passed
  Test: test_raster_intersects ...passed
  Test: test_raster_same_alignment ...passed
Suite: misc
  Test: test_rgb_to_hsv ...passed
  Test: test_hsv_to_rgb ...passed
  Test: test_util_gdal_open ...passed

Run Summary:    Type  Total    Ran Passed Failed Inactive
              suites     12     12    n/a      0        0
               tests     65     65     65      0        0
             asserts  45896  45896  45896      0      n/a

Elapsed time =    0.499 seconds

Loading Raster into 'postgis_reg'
PostgreSQL 12devel on x86_64-w64-mingw32, compiled by gcc.exe (x86_64-posix-seh-rev0, Built by MinGW-W64 project) 8.1.0, 64-bit
  Postgis 3.0.0dev - r17081 - 2018-11-28 18:50:02
  scripts 3.0.0dev r17081
  raster scripts 3.0.0dev r17081
  GEOS: 3.7.0-CAPI-1.11.0 673b9939
  PROJ: Rel. 5.2.0, September 15th, 2018
  GDAL: GDAL 2.3.1, released 2018/06/22

Running tests

 check_gdal .. ok
 load_outdb ... ok
 check_raster_columns .. ok
 check_raster_overviews .. ok
 rt_io .. ok
 rt_bytea .. ok
 rt_wkb .. ok
 box3d .. ok
 rt_addband .. ok
 rt_band .. ok
 rt_tile .. ok
 rt_dimensions .. ok
 rt_scale .. ok
 rt_pixelsize .. ok
 rt_upperleft .. ok
 rt_rotation .. ok
 rt_georeference .. ok
 rt_set_properties .. ok
 rt_isempty .. ok
 rt_hasnoband .. ok
 rt_metadata .. ok
 rt_rastertoworldcoord .. ok
 rt_worldtorastercoord .. ok
 rt_convexhull .. ok
 rt_envelope .. ok
 rt_band_properties .. ok
 rt_set_band_properties .. ok
 rt_pixelaspolygons .. ok
 rt_pixelaspoints .. ok
 rt_pixelascentroids .. ok
 rt_setvalues_array .. ok
 rt_summarystats .. ok
 rt_count .. ok
 rt_histogram .. ok
 rt_quantile .. ok
 rt_valuecount .. ok
 rt_valuepercent .. ok
 rt_bandmetadata .. ok
 rt_pixelvalue .. ok
 rt_neighborhood .. ok
 rt_nearestvalue .. ok
 rt_pixelofvalue .. ok
 rt_polygon .. ok
 rt_setbandpath .. ok
 rt_utility .. ok
 rt_fromgdalraster .. ok
 rt_asgdalraster .. ok
 rt_astiff .. ok
 rt_asjpeg .. ok
 rt_aspng .. ok
 rt_reclass .. ok
 rt_gdalwarp .. ok
 rt_asraster .. ok
 rt_dumpvalues .. ok
 rt_makeemptycoverage .. ok
 rt_createoverview .. ok
 rt_mapalgebraexpr .. ok
 rt_mapalgebrafct .. ok
 rt_mapalgebraexpr_2raster .. ok
 rt_mapalgebrafct_2raster .. ok
 rt_mapalgebrafctngb .. ok
 rt_mapalgebrafctngb_userfunc .. ok
 rt_intersection .. ok
 rt_clip .. ok
 rt_mapalgebra .. ok
 rt_mapalgebra_expr .. ok
 rt_mapalgebra_mask .. ok
 rt_union .. ok
 rt_invdistweight4ma .. ok
 rt_4ma .. ok
 rt_setvalues_geomval .. ok
 rt_elevation_functions .. ok
 rt_colormap .. ok
 rt_grayscale .. ok
 rt_gist_relationships .. ok
 rt_intersects .. ok
 rt_samealignment .. ok
 rt_geos_relationships .. ok
 rt_iscoveragetile .. ok
 bug_test_car5 .. ok
 permitted_gdal_drivers .. ok
 tickets .. ok
 loader/Basic .. ok
 loader/Projected ... ok
 loader/BasicCopy .. ok
 loader/BasicFilename .. ok
 loader/BasicOutDB .. ok
 loader/Tiled10x10 .. ok
 loader/Tiled10x10Copy .. ok
 loader/Tiled8x8 .. ok
 clean .. ok
 uninstall .. ok (4643)

Run tests: 101
Failed: 0

-- topology regress
PostgreSQL 12devel on x86_64-w64-mingw32, compiled by gcc.exe (x86_64-posix-seh-rev0, Built by MinGW-W64 project) 8.1.0, 64-bit
  Postgis 3.0.0dev - r17081 - 2018-11-28 18:50:02
  scripts 3.0.0dev r17081
  GEOS: 3.7.0-CAPI-1.11.0 673b9939
  PROJ: Rel. 5.2.0, September 15th, 2018

Running tests

 regress/legacy_validate .. ok
 regress/legacy_predicate .. ok
 regress/legacy_invalid .. ok
 regress/sqlmm .. ok
 regress/legacy_query .. ok
 regress/addnode .. ok
 regress/addedge .. ok
 regress/addface .. ok
 regress/addface2.5d .. ok
 regress/addtopogeometrycolumn .. ok
 regress/polygonize .. ok
 regress/st_addisoedge .. ok
 regress/st_addisonode .. ok
 regress/st_addedgemodface .. ok
 regress/st_addedgenewfaces .. ok
 regress/st_changeedgegeom .. ok
 regress/st_createtopogeo .. ok
 regress/st_getfacegeometry .. ok
 regress/st_getfaceedges .. ok
 regress/st_modedgeheal .. ok
 regress/st_modedgesplit .. ok
 regress/st_newedgeheal .. ok
 regress/st_newedgessplit .. ok
 regress/st_remedgenewface .. ok
 regress/st_remedgemodface .. ok
 regress/st_simplify .. ok
 regress/topoelement .. ok
 regress/topoelementarray_agg .. ok
 regress/topogeo_addlinestring .. ok
 regress/topogeo_addpoint .. ok
 regress/topogeo_addpolygon .. ok
 regress/topogeom_edit .. ok
 regress/topogeometry_type .. ok
 regress/topojson .. ok
 regress/topologysummary .. ok
 regress/topo2.5d .. ok
 regress/totopogeom .. ok
 regress/droptopology .. ok
 regress/droptopogeometrycolumn .. ok
 regress/copytopology .. ok
 regress/createtopogeom .. ok
 regress/createtopology .. ok
 regress/gml .. ok
 regress/getnodebypoint .. ok
 regress/getedgebypoint .. ok
 regress/getfacebypoint .. ok
 regress/getringedges .. ok
 regress/gettopogeomelements .. ok
 regress/layertrigger .. ok
 regress/validatetopology .. ok
 uninstall .. ok (4643)

Run tests: 51
Failed: 0

-- if you built --with-gui, you should see this too

     CUnit - A unit testing framework for C - Version 2.1-2
     http://cunit.sourceforge.net/


Suite: Shapefile Loader File shp2pgsql Test
  Test: test_ShpLoaderCreate() ...passed
  Test: test_ShpLoaderDestroy() ...passed
Suite: Shapefile Loader File pgsql2shp Test
  Test: test_ShpDumperCreate() ...passed
  Test: test_ShpDumperDestroy() ...passed

Run Summary:    Type  Total    Ran Passed Failed Inactive
              suites      2      2    n/a      0        0
               tests      4      4      4      0        0
             asserts      4      4      4      0      n/a

The postgis_tiger_geocoder and address_standardizer extensions currently support only the standard PostgreSQL installcheck. To test them, use the commands below. Note: the make install is not necessary if you already did make install at the root of the PostGIS code folder.

For address_standardizer:

cd extensions/address_standardizer
make install
make installcheck
	  

Output should look like:

============== dropping database "contrib_regression" ==============
DROP DATABASE
============== creating database "contrib_regression" ==============
CREATE DATABASE
ALTER DATABASE
============== running regression test queries        ==============
test test-init-extensions     ... ok
test test-parseaddress        ... ok
test test-standardize_address_1 ... ok
test test-standardize_address_2 ... ok

=====================
 All 4 tests passed.
=====================

For the tiger geocoder, make sure you have the postgis and fuzzystrmatch extensions available in your PostgreSQL instance. The address_standardizer tests will also kick in if you built PostGIS with address_standardizer support:

cd extensions/postgis_tiger_geocoder
make install
make installcheck
	  

Output should look like:

============== dropping database "contrib_regression" ==============
DROP DATABASE
============== creating database "contrib_regression" ==============
CREATE DATABASE
ALTER DATABASE
============== installing fuzzystrmatch               ==============
CREATE EXTENSION
============== installing postgis                     ==============
CREATE EXTENSION
============== installing postgis_tiger_geocoder      ==============
CREATE EXTENSION
============== installing address_standardizer        ==============
CREATE EXTENSION
============== running regression test queries        ==============
test test-normalize_address   ... ok
test test-pagc_normalize_address ... ok

=====================
All 2 tests passed.
=====================

2.5.5. Installation

To install PostGIS, type

make install

This will copy the PostGIS installation files into the appropriate subdirectories of the location specified by the --prefix configuration parameter. In particular:

  • The loader and dumper binaries are installed in [prefix]/bin .

  • The SQL files, such as postgis.sql , are installed in [prefix]/share/contrib .

  • The PostGIS libraries are installed in [prefix]/lib .

If you previously ran the make comments command to generate the postgis_comments.sql , raster_comments.sql , and topology_comments.sql files, install them by running

make comments-install

[Note]

postgis_comments.sql , raster_comments.sql , and topology_comments.sql are separated from the typical build and installation targets because with them comes the extra dependency of xsltproc .

2.6. Creating a spatial database using EXTENSIONS

If you are using PostgreSQL 9.1+ and have compiled and installed the extensions/postgis modules, you can create a spatial database the new way.

createdb [yourdatabase]

The core postgis extension installs PostGIS geometry, geography, spatial_ref_sys and all the functions and comments with a simple:

CREATE EXTENSION postgis;

command.

psql -d [yourdatabase] -c "CREATE EXTENSION postgis;"

Raster is packaged as a separate extension and installable with the command:

psql -d [yourdatabase] -c "CREATE EXTENSION postgis_raster;"

Topology is packaged as a separate extension and installable with the command:

psql -d [yourdatabase] -c "CREATE EXTENSION postgis_topology;"

If you plan to restore an old backup from prior versions in this new db, run:

psql -d [yourdatabase] -f legacy.sql

[Note]

If you need legacy functions, you'll need to reinstall the legacy.sql script whenever you upgrade the minor version of PostGIS. E.g. if you upgraded from 2.4.3 to 2.5.0, then you need to reinstall the legacy.sql packaged with 2.5.0. This is because some of the functions make reference to the library, and the library is named with the minor version in it.

You can later run uninstall_legacy.sql to get rid of the deprecated functions after you are done with restoring and cleanup.
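
For example, once restore and cleanup are done:

psql -d [yourdatabase] -f uninstall_legacy.sql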

2.7. Create a spatially-enabled database without using extensions

[Note]

This is generally only needed if you cannot or don't want to get PostGIS installed in the PostgreSQL extension directory (for example during testing, development or in a restricted environment).

The first step in creating a PostGIS database is to create a simple PostgreSQL database.

createdb [yourdatabase]

Many of the PostGIS functions are written in the PL/pgSQL procedural language. As such, the next step to create a PostGIS database is to enable the PL/pgSQL language in your new database. This is accomplished by the command below. For PostgreSQL 8.4+, PL/pgSQL is generally already installed.

createlang plpgsql [yourdatabase]

Now load the PostGIS object and function definitions into your database by loading the postgis.sql definitions file (located in [prefix]/share/contrib as specified during the configuration step).

psql -d [yourdatabase] -f postgis.sql

For a complete set of EPSG coordinate system definition identifiers, you can also load the spatial_ref_sys.sql definitions file and populate the spatial_ref_sys table. This will permit you to perform ST_Transform() operations on geometries.

psql -d [yourdatabase] -f spatial_ref_sys.sql
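
As a quick sanity check that the spatial_ref_sys table was populated, you can transform an arbitrary point between two EPSG codes (the coordinates and the target SRID 26986 below are only examples):

SELECT ST_AsText(ST_Transform(ST_SetSRID(ST_MakePoint(-71.064544, 42.28787), 4326), 26986));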

If you wish to add comments to the PostGIS functions, the final step is to load the postgis_comments.sql into your spatial database. The comments can be viewed by simply typing \dd [function_name] from a psql terminal window.

psql -d [yourdatabase] -f postgis_comments.sql

Install raster support

psql -d [yourdatabase] -f rtpostgis.sql

Install raster support comments. This will provide quick help info for each raster function using psql, PgAdmin, or any other PostgreSQL tool that can show function comments.

psql -d [yourdatabase] -f raster_comments.sql

Install topology support

psql -d [yourdatabase] -f topology/topology.sql

Install topology support comments. This will provide quick help info for each topology function / type using psql, PgAdmin, or any other PostgreSQL tool that can show function comments.

psql -d [yourdatabase] -f topology/topology_comments.sql

If you plan to restore an old backup from prior versions in this new db, run:

psql -d [yourdatabase] -f legacy.sql

[Note]

There is an alternative legacy_minimal.sql you can run instead, which installs only the bare minimum needed to recover tables and work with applications like MapServer and GeoServer. If you have views that use functions such as distance / length, you'll need the full legacy.sql.

You can later run uninstall_legacy.sql to get rid of the deprecated functions after you are done with restoring and cleanup.

2.8. Installing and Using the address standardizer

The address_standardizer extension used to be a separate package that required a separate download. From PostGIS 2.2 on, it is bundled in. For more information about the address_standardizer, what it does, and how to configure it for your needs, refer to Chapter 12, Address Standardizer .

This standardizer can be used in conjunction with the PostGIS packaged tiger geocoder extension as a replacement for Normalize_Address. To use it as a replacement, refer to Section 2.9.3, “Using Address Standardizer Extension with Tiger geocoder” . You can also use it as a building block for your own geocoder, or use it to standardize your addresses for easier comparison.

The address standardizer relies on PCRE, which is usually already installed on many Unix-like systems, but you can download the latest at: http://www.pcre.org . If PCRE is found during Section 2.5.1, “Configuration” , then the address standardizer extension will automatically be built. If you have a custom PCRE install you want to use instead, pass --with-pcredir=/path/to/pcre to configure, where /path/to/pcre is the root folder for your pcre include and lib directories.

For Windows users, the PostGIS 2.1+ bundle already includes the address_standardizer, so there is no need to compile it and you can move straight to the CREATE EXTENSION step.

Once you have installed it, you can connect to your database and run the SQL:

CREATE EXTENSION address_standardizer;

The following test requires no rules, gaz, or lex tables

SELECT num, street, city, state, zip
 FROM parse_address('1 Devonshire Place PH301, Boston, MA 02109');

Output should be

 num |         street         |  city  | state |  zip
-----+------------------------+--------+-------+-------
 1   | Devonshire Place PH301 | Boston | MA    | 02109

2.8.1. Installing Regexp::Assemble

Perl Regexp::Assemble is no longer needed for compiling the address_standardizer extension, since the files it generates are part of the source tree. However, if you need to edit usps-st-city-orig.txt or usps-st-city-adds.tx , you need to rebuild parseaddress-stcities.h, which does require Regexp::Assemble.

cpan Regexp::Assemble

or if you are on Ubuntu / Debian you might need to do

sudo perl -MCPAN -e "install Regexp::Assemble"

2.9. Installing, Upgrading Tiger Geocoder and loading data

Extras like the Tiger geocoder may not be packaged in your PostGIS distribution. If you are missing the tiger geocoder extension or want a newer version than what your install comes with, then use the share/extension/postgis_tiger_geocoder.* files from the packages in the Windows Unreleased Versions section for your version of PostgreSQL. Although these packages are for Windows, the postgis_tiger_geocoder extension files will work on any OS since the extension is an SQL/plpgsql only extension.
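
To see which versions of the tiger geocoder extension your PostgreSQL installation can actually see, you can query the standard catalog view (shown here as a sketch):

SELECT name, version, installed
  FROM pg_available_extension_versions
  WHERE name = 'postgis_tiger_geocoder';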

2.9.1. Tiger Geocoder Enabling your PostGIS database: Using Extension

If you are using PostgreSQL 9.1+ and PostGIS 2.1+, you can take advantage of the new extension model for installing tiger geocoder. To do so:

  1. First get binaries for PostGIS 2.1+ or compile and install as usual. This should install the necessary extension files as well for tiger geocoder.

  2. Connect to your database via psql or pgAdmin or some other tool and run the following SQL commands. Note that if you are installing in a database that already has postgis, you don't need to do the first step. If you have fuzzystrmatch extension already installed, you don't need to do the second step either.

    CREATE EXTENSION postgis;
    CREATE EXTENSION fuzzystrmatch;
    CREATE EXTENSION postgis_tiger_geocoder;
    --this one is optional if you want to use the rules based standardizer (pagc_normalize_address)
    CREATE EXTENSION address_standardizer;

    If you already have the postgis_tiger_geocoder extension installed and just want to update to the latest version, run:

    ALTER EXTENSION postgis UPDATE;
    ALTER EXTENSION postgis_tiger_geocoder UPDATE;

    If you made custom entries or changes to tiger.loader_platform and tiger.loader_variables you may need to update these.

  3. To confirm your install is working correctly, run this sql in your database:

    SELECT na.address, na.streetname,na.streettypeabbrev, na.zip
    	FROM normalize_address('1 Devonshire Place, Boston, MA 02109') AS na;

    Which should output

     address | streetname | streettypeabbrev |  zip
    ---------+------------+------------------+-------
    	   1 | Devonshire | Pl               | 02109
  4. Create a new record in tiger.loader_platform table with the paths of your executables and server.

    For example, to create a profile called debbie that follows the sh convention, you would do:

    INSERT INTO tiger.loader_platform(os, declare_sect, pgbin, wget, unzip_command, psql, path_sep,
    		   loader, environ_set_command, county_process_command)
    SELECT 'debbie', declare_sect, pgbin, wget, unzip_command, psql, path_sep,
    	   loader, environ_set_command, county_process_command
      FROM tiger.loader_platform
      WHERE os = 'sh';

    Then edit the paths in the declare_sect column to those that fit Debbie's pg, unzip, shp2pgsql, psql, etc. path locations.

    If you don't edit this loader_platform table, it will just contain common default locations and you'll have to edit the generated script after it is generated.

  5. As of PostGIS 2.4.1 the ZIP Code 5-digit Tabulation Area (zcta5) load step was revised to load current zcta5 data and is part of the Loader_Generate_Nation_Script when enabled. It is turned off by default because it takes quite a bit of time to load (20 to 60 minutes), takes up quite a bit of disk space, and is not used that often.

    To enable it, do the following:

    UPDATE tiger.loader_lookuptables SET load = true WHERE table_name = 'zcta510';

    If present, the Geocode function can use it when a boundary filter is added to limit results to just zips in that boundary. The Reverse_Geocode function uses it if the returned address is missing a zip, which often happens with highway reverse geocoding.

  6. Create a folder called gisdata on the root of the server or your local PC if you have a fast network connection to the server. This folder is where the tiger files will be downloaded to and processed. If you are not happy with having the folder on the root of the server, or simply want to change to a different folder for staging, then edit the field staging_fold in the tiger.loader_variables table.

  7. Create a folder called temp in the gisdata folder or wherever you designated the staging_fold to be. This will be the folder where the loader extracts the downloaded tiger data.

  8. Then run the Loader_Generate_Nation_Script SQL function, making sure to use the name of your custom profile, and copy the script to a .sh or .bat file. For example, to build the nation load:

    psql -c "SELECT Loader_Generate_Nation_Script('debbie')" -d geocoder -tA > /gisdata/nation_script_load.sh
  9. Run the generated nation load commandline scripts.

    cd /gisdata
    sh nation_script_load.sh
  10. After you are done running the nation script, you should have three tables in your tiger_data schema and they should be filled with data. Confirm this by running the following queries from psql or pgAdmin:

    SELECT count(*) FROM tiger_data.county_all;
     count
    -------
      3233
    (1 row)
    SELECT count(*) FROM tiger_data.state_all;
     count
    -------
        56
    (1 row)
    
  11. By default the tables corresponding to bg , tract , tabblock are not loaded. These tables are not used by the geocoder but are used by folks for population statistics. If you wish to load them as part of your state loads, run the following statement to enable them.

    UPDATE tiger.loader_lookuptables SET load = true WHERE load = false AND lookup_name IN('tract', 'bg', 'tabblock');

    Alternatively, you can load just these tables after loading state data using Loader_Generate_Census_Script. A query to review which optional tables are currently flagged for loading is shown after this list.

  12. For each state you want to load data for, generate a state script with Loader_Generate_Script .

    [Warning]

    DO NOT generate the state script until you have already loaded the nation data, because the state script uses the county list loaded by the nation script.

  13. psql -c "SELECT Loader_Generate_Script(ARRAY['MA'], 'debbie')" -d geocoder -tA > /gisdata/ma_load.sh
  14. Run the generated commandline scripts.

    cd /gisdata
    sh ma_load.sh
  15. After you are done loading all data, or at a stopping point, it's a good idea to analyze all the tiger tables to update the stats (including inherited stats):

    SELECT install_missing_indexes();
    vacuum (analyze, verbose) tiger.addr;
    vacuum (analyze, verbose) tiger.edges;
    vacuum (analyze, verbose) tiger.faces;
    vacuum (analyze, verbose) tiger.featnames;
    vacuum (analyze, verbose) tiger.place;
    vacuum (analyze, verbose) tiger.cousub;
    vacuum (analyze, verbose) tiger.county;
    vacuum (analyze, verbose) tiger.state;
    vacuum (analyze, verbose) tiger.zip_lookup_base;
    vacuum (analyze, verbose) tiger.zip_state;
    vacuum (analyze, verbose) tiger.zip_state_loc;
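
To review which optional tables are currently flagged for loading, as mentioned in the steps above, you can inspect the loader configuration (a sketch, assuming the default tiger.loader_lookuptables layout):

SELECT lookup_name, table_name, load
  FROM tiger.loader_lookuptables;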

2.9.1.1. Converting a Tiger Geocoder Regular Install to Extension Model

If you installed the tiger geocoder without using the extension model, you can convert to the extension model as follows:

  1. Follow instructions in Section 2.9.5, “Upgrading your Tiger Geocoder Install” for the non-extension model upgrade.

  2. Connect to your database with psql or pgAdmin and run the following command:

    CREATE EXTENSION postgis_tiger_geocoder FROM unpackaged;

2.9.2. Tiger Geocoder Enabling your PostGIS database: Not Using Extensions

First install PostGIS using the prior instructions.

If you don't have an extras folder, download http://download.osgeo.org/postgis/source/postgis-3.0.4.tar.gz

tar xvfz postgis-3.0.4.tar.gz

cd postgis-3.0.4/extras/tiger_geocoder

Edit tiger_loader_2015.sql (or the latest loader file you find, unless you want to load a different year) to set the paths of your executables, server, etc., or alternatively update the loader_platform table once installed. If you don't edit this file or the loader_platform table, it will just contain common default locations and you'll have to edit the generated script after the fact when you run the Loader_Generate_Nation_Script and Loader_Generate_Script SQL functions.

If you are installing the Tiger geocoder for the first time, edit either the create_geocode.bat script (if you are on Windows) or create_geocode.sh (if you are on Linux/Unix/Mac OSX) with your PostgreSQL-specific settings, and run the corresponding script from the command line.

Verify that you now have a tiger schema in your database and that it is part of your database search_path. If it is not, add it with a command along the lines of:

ALTER DATABASE geocoder SET search_path=public, tiger;
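
To verify the change, reconnect (ALTER DATABASE settings only take effect for new sessions) and check the current setting:

SHOW search_path;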

The address normalizing functionality works more or less without any data, except for tricky addresses. Run this test and verify the output looks like this:

SELECT pprint_addy(normalize_address('202 East Fremont Street, Las Vegas, Nevada 89101')) As pretty_address;
pretty_address
---------------------------------------
202 E Fremont St, Las Vegas, NV 89101
			

2.9.3. Using Address Standardizer Extension with Tiger geocoder

A common complaint concerns the Normalize_Address function, which normalizes an address in preparation for geocoding. The normalizer is far from perfect, and trying to patch its imperfections takes a vast amount of resources. As such, we have integrated with another project that has a much better address standardizer engine. To use this new address_standardizer, you compile the extension as described in Section 2.8, “Installing and Using the address standardizer” and install it as an extension in your database.

Once you install this extension in the same database as postgis_tiger_geocoder, Pagc_Normalize_Address can be used instead of Normalize_Address. This extension is tiger agnostic, so it can be used with other data sources such as international addresses. The tiger geocoder extension comes packaged with its own custom versions of the rules table ( tiger.pagc_rules ), gaz table ( tiger.pagc_gaz ), and lex table ( tiger.pagc_lex ), which you can add to and update to improve standardization for your own needs.
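
A minimal example of the replacement function, assuming both extensions are installed in the same database and the tiger schema is on your search_path:

SELECT pprint_addy(pagc_normalize_address('1 Devonshire Place PH301, Boston, MA 02109'));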

2.9.4. Loading Tiger Data

The instructions for loading data are available in a more detailed form in extras/tiger_geocoder/tiger_2011/README . This section just covers the general steps.

The load process downloads data from the census website for the respective nation files and requested states, extracts the files, and then loads each state into its own separate set of state tables. Each state table inherits from the tables defined in the tiger schema, so it is sufficient to query those tables to access all the data. You can drop a set of state tables at any time using Drop_State_Tables_Generate_Script if you need to reload a state or just don't need it anymore.

In order to be able to load data you'll need the following tools:

  • A tool to unzip the zip files from census website.

    For Unix-like systems: the unzip executable, which is usually already installed on most Unix-like platforms.

    For Windows, 7-zip which is a free compress/uncompress tool you can download from http://www.7-zip.org/

  • shp2pgsql commandline which is installed by default when you install PostGIS.

  • wget which is a web grabber tool usually installed on most Unix/Linux systems.

    If you are on windows, you can get pre-compiled binaries from http://gnuwin32.sourceforge.net/packages/wget.htm

If you are upgrading from tiger_2010, you'll need to first generate and run Drop_Nation_Tables_Generate_Script . Before you load any state data, you need to load the nationwide data, which you do with Loader_Generate_Nation_Script . This will generate a loader script for you. Loader_Generate_Nation_Script is a one-time step that should be done when upgrading (from 2010) and for new installs.

To load state data, refer to Loader_Generate_Script to generate a data load script for your platform for the states you desire. Note that you can load states piecemeal; you don't have to load all the states you want at once. You can load them as you need them.

After the states you desire have been loaded, make sure to run:

SELECT install_missing_indexes();

as described in Install_Missing_Indexes .

To test that things are working as they should, try running a geocode on an address in your state using Geocode, as in the example below.
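
For example, assuming you have loaded Massachusetts data, something along these lines should return a rated, normalized address and a point geometry (the address itself is just an example):

SELECT g.rating, pprint_addy(g.addy) As address, ST_AsText(g.geomout) As pt
  FROM geocode('75 State Street, Boston MA 02109', 1) As g;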

2.9.5. Upgrading your Tiger Geocoder Install

If you have the Tiger Geocoder packaged with 2.0+ already installed, you can upgrade the functions at any time, even from an interim tar ball, if there are fixes you badly need. This will only work for a Tiger geocoder not installed with extensions.

If you don't have an extras folder, download http://download.osgeo.org/postgis/source/postgis-3.0.4.tar.gz

tar xvfz postgis-3.0.4.tar.gz

cd postgis-3.0.4/extras/tiger_geocoder/tiger_2011

Locate the upgrade_geocoder.bat script if you are on Windows, or upgrade_geocoder.sh if you are on Linux/Unix/Mac OSX. Edit the file to include your postgis database credentials.

If you are upgrading from 2010 or 2011, make sure to uncomment the loader script line so you get the latest script for loading 2012 data.

Then run the corresponding script from the command line.

Next, drop all nation tables and load the new ones. Generate a drop script with this SQL statement, as detailed in Drop_Nation_Tables_Generate_Script :

SELECT drop_nation_tables_generate_script();

Run the generated drop SQL statements.

Generate a nation load script with this SELECT statement, as detailed in Loader_Generate_Nation_Script :

For windows

SELECT loader_generate_nation_script('windows'); 

For unix/linux

SELECT loader_generate_nation_script('sh');
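
As with the nation load shown earlier, you can capture the output directly into a script file (the database name and output path below are assumptions):

psql -c "SELECT loader_generate_nation_script('sh')" -d geocoder -tA > /gisdata/nation_script_load.sh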

Refer to Section 2.9.4, “Loading Tiger Data” for instructions on how to run the generated script. This only needs to be done once.

[Note]

You can have a mix of 2010/2011 state tables and can upgrade each state separately. Before you upgrade a state to 2011, you first need to drop the 2010 tables for that state using Drop_State_Tables_Generate_Script .

2.10. Create a spatially-enabled database from a template

Some packaged distributions of PostGIS (in particular the Win32 installers for PostGIS >= 1.1.5) load the PostGIS functions into a template database called template_postgis . If the template_postgis database exists in your PostgreSQL installation then it is possible for users and/or applications to create spatially-enabled databases using a single command. Note that in both cases, the database user must have been granted the privilege to create new databases.

From the shell:

# createdb -T template_postgis my_spatial_db

From SQL:

postgres=# CREATE DATABASE my_spatial_db TEMPLATE=template_postgis;

2.11. Upgrading

Upgrading existing spatial databases can be tricky as it requires replacement or introduction of new PostGIS object definitions.

Unfortunately not all definitions can be easily replaced in a live database, so sometimes your best bet is a dump/reload process.

PostGIS provides a SOFT UPGRADE procedure for minor or bugfix releases, and a HARD UPGRADE procedure for major releases.

Before attempting to upgrade PostGIS, it is always worth backing up your data. If you use the -Fc flag to pg_dump, you will always be able to restore the dump with a HARD UPGRADE.

2.11.1. Soft upgrade

If you installed your database using extensions, you'll need to upgrade using the extension model as well. If you installed using the old sql script way, then you should upgrade using the sql script way. Please refer to the appropriate section below.

2.11.1.1. Soft Upgrade Pre 9.1+ or without extensions

This section applies only to those who installed PostGIS not using extensions. If you have extensions and try to upgrade with this approach you'll get messages like:

can't drop ... because postgis extension depends on it

NOTE: if you are moving from PostGIS 1.* to PostGIS 2.*, or from PostGIS 2.* prior to r7409, you cannot use this procedure but instead need to do a HARD UPGRADE .

After compiling and installing (make install) you should find a set of *_upgrade.sql files in the installation folders. You can list them all with:

		ls `pg_config --sharedir`/contrib/postgis-3.0.4/*_upgrade.sql
		

Load them all in turn, starting from postgis_upgrade.sql .

		psql -f postgis_upgrade.sql -d your_spatial_database
		

The same procedure applies to raster, topology and sfcgal extensions, with upgrade files named rtpostgis_upgrade.sql , topology_upgrade.sql and sfcgal_upgrade.sql respectively. If you need them:

psql -f rtpostgis_upgrade.sql -d your_spatial_database
psql -f topology_upgrade.sql -d your_spatial_database
psql -f sfcgal_upgrade.sql -d your_spatial_database
[Note]

If you can't find the postgis_upgrade.sql specific to upgrading your version, you are using a version too early for a soft upgrade and need to do a HARD UPGRADE .

The PostGIS_Full_Version function should inform you about the need to run this kind of upgrade using a "procs need upgrade" message.
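
For example, run:

SELECT postgis_full_version();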

2.11.1.2. Soft Upgrade 9.1+ using extensions

If you originally installed PostGIS with extensions, then you need to upgrade using extensions as well. Doing a minor upgrade with extensions is fairly painless.

ALTER EXTENSION postgis UPDATE TO "3.0.4";
ALTER EXTENSION postgis_topology UPDATE TO "3.0.4";

If you get an error notice something like:

No migration path defined for ... to 3.0.4

Then you'll need to back up your database, create a fresh one as described in Section 2.6, “Creating a spatial database using EXTENSIONS”, and then restore your backup on top of this new database.
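
To see which upgrade paths the installed extension scripts actually provide, you can use the standard PostgreSQL function pg_extension_update_paths (the filter below is just an example):

SELECT * FROM pg_extension_update_paths('postgis')
 WHERE path IS NOT NULL AND target = '3.0.4';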

If you get a notice message like:

Version "3.0.4" of extension "postgis" is already installed

Then everything is already up to date and you can safely ignore it. UNLESS you're attempting to upgrade from a development version to the next (which doesn't get a new version number); in that case you can append "next" to the version string, and next time you'll need to drop the "next" suffix again:

ALTER EXTENSION postgis UPDATE TO "3.0.4next";
ALTER EXTENSION postgis_topology UPDATE TO "3.0.4next";
[Note]

If you installed PostGIS originally without a version specified, you can often skip the reinstallation of the postgis extension before restoring, since the backup just has CREATE EXTENSION postgis and thus picks up the newest version during restore.

[Note]

If you are upgrading PostGIS extension from a version prior to 3.0.0 you'll end up with an unpackaged PostGIS Raster support. You can repackage the raster support using:

    CREATE EXTENSION postgis_raster FROM unpackaged;
    

And then, if you don't need it, drop it with:

    DROP EXTENSION postgis_raster;
    

2.11.2. Hard upgrade

By HARD UPGRADE we mean full dump/reload of postgis-enabled databases. You need a HARD UPGRADE when PostGIS objects' internal storage changes or when SOFT UPGRADE is not possible. The Release Notes appendix reports for each version whether you need a dump/reload (HARD UPGRADE) to upgrade.

The dump/reload process is assisted by the postgis_restore.pl script which takes care of skipping from the dump all definitions which belong to PostGIS (including old ones), allowing you to restore your schemas and data into a database with PostGIS installed without getting duplicate symbol errors or bringing forward deprecated objects.

Supplementary instructions for windows users are available at Windows Hard upgrade .

The procedure is as follows:

  1. Create a "custom-format" dump of the database you want to upgrade (let's call it olddb ) include binary blobs (-b) and verbose (-v) output. The user can be the owner of the db, need not be postgres super account.

    pg_dump -h localhost -p 5432 -U postgres -Fc -b -v -f "/somepath/olddb.backup" olddb
  2. Do a fresh install of PostGIS in a new database -- we'll refer to this database as newdb . Please refer to Section 2.7, “Create a spatially-enabled database without using extensions” and Section 2.6, “Creating a spatial database using EXTENSIONS” for instructions on how to do this.

    The spatial_ref_sys entries found in your dump will be restored, but they will not override existing ones in spatial_ref_sys. This is to ensure that fixes in the official set will be properly propagated to restored databases. If for any reason you really want your own overrides of standard entries just don't load the spatial_ref_sys.sql file when creating the new db.

    If your database is really old or you know you've been using long deprecated functions in your views and functions, you might need to load legacy.sql for all your functions and views etc. to properly come back. Only do this if _really_ needed. Consider upgrading your views and functions before dumping instead, if possible. The deprecated functions can be later removed by loading uninstall_legacy.sql .

  3. Restore your backup into your fresh newdb database using postgis_restore.pl. Unexpected errors, if any, will be printed to the standard error stream by psql. Keep a log of those.

    perl utils/postgis_restore.pl "/somepath/olddb.backup" | psql -h localhost -p 5432 -U postgres newdb 2> errors.txt

Errors may arise in the following cases:

  1. Some of your views or functions make use of deprecated PostGIS objects. In order to fix this you may try loading the legacy.sql script prior to restore, or you'll have to restore to a version of PostGIS which still contains those objects and try the migration again after porting your code. If the legacy.sql way works for you, don't forget to fix your code to stop using deprecated functions, and drop them by loading uninstall_legacy.sql .

  2. Some custom records of spatial_ref_sys in the dump file have an invalid SRID value. Valid SRID values are bigger than 0 and smaller than 999000. Values in the 999000-999999 range are reserved for internal use, while values > 999999 can't be used at all. All your custom records with invalid SRIDs will be retained, with those > 999999 moved into the reserved range, but the spatial_ref_sys table will lose the check constraint guarding that invariant and possibly also its primary key (when multiple invalid SRIDs get converted to the same reserved SRID value).

    In order to fix this you should copy your custom SRS to an SRID with a valid value (maybe in the 910000..910999 range), convert all your tables to the new srid (see UpdateGeometrySRID ), delete the invalid entry from spatial_ref_sys and re-construct the check(s) with:

    ALTER TABLE spatial_ref_sys ADD CONSTRAINT spatial_ref_sys_srid_check check (srid > 0 AND srid < 999000 );

    ALTER TABLE spatial_ref_sys ADD PRIMARY KEY(srid);

    If you are upgrading an old database containing French IGN cartography, you will probably have SRIDs out of range, and when importing your database you will see issues like this:

     WARNING: SRID 310642222 converted to 999175 (in reserved zone)

    In this case, you can try the following steps: first, completely remove the IGN entries from the SQL resulting from postgis_restore.pl. So, after having run:

    perl utils/postgis_restore.pl "/somepath/olddb.backup" > olddb.sql

    run this command:

    grep -v IGNF olddb.sql > olddb-without-IGN.sql

    Then create your newdb, activate the required PostGIS extensions, and properly insert the French IGN systems with this script. After these operations, import your data:

    psql -h localhost -p 5432 -U postgres -d newdb -f olddb-without-IGN.sql  2> errors.txt

2.12. Common Problems during installation

There are several things to check when your installation or upgrade doesn't go as you expected.

  1. Check that you have installed PostgreSQL 9.5 or newer, and that you are compiling against the same version of the PostgreSQL source as the version of PostgreSQL that is running. Mix-ups can occur when your (Linux) distribution has already installed PostgreSQL, or you have otherwise installed PostgreSQL before and forgotten about it. PostGIS will only work with PostgreSQL 9.5 or newer, and strange, unexpected error messages will result if you use an older version. To check the version of PostgreSQL which is running, connect to the database using psql and run this query:

    SELECT version();

    If you are running an RPM based distribution, you can check for the existence of pre-installed packages using the rpm command as follows: rpm -qa | grep postgresql

  2. If your upgrade fails, make sure you are restoring into a database that already has PostGIS installed.

    SELECT postgis_full_version();

Also check that configure has correctly detected the location and version of PostgreSQL, the Proj4 library and the GEOS library.

  1. The output from configure is used to generate the postgis_config.h file. Check that the POSTGIS_PGSQL_VERSION , POSTGIS_PROJ_VERSION and POSTGIS_GEOS_VERSION variables have been set correctly.
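
One quick way to inspect those values is a simple grep, assuming you are in the top of the PostGIS build tree where postgis_config.h is generated:

grep -E 'POSTGIS_(PGSQL|PROJ|GEOS)_VERSION' postgis_config.h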

2.13. Loader/Dumper

The data loader and dumper are built and installed automatically as part of the PostGIS build. To build and install them manually:

# cd postgis-3.0.4/loader
# make
# make install

The loader is called shp2pgsql and converts ESRI Shape files into SQL suitable for loading into PostGIS/PostgreSQL. The dumper is called pgsql2shp and converts PostGIS tables (or queries) into ESRI Shape files. For more verbose documentation, see the online help and the manual pages.
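
Typical usage looks like the following; the shapefile name, table name, SRID, and connection details are placeholders:

# shp2pgsql -s 4326 -I roads.shp public.roads | psql -d yourdatabase
# pgsql2shp -f roads_out -h localhost -u postgres yourdatabase public.roads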