redshift-fake-driver


redshift-fake-driver is a JDBC driver that accepts Redshift-specific commands (e.g. COPY, UNLOAD), which makes it useful for local testing.

The driver uses the AWS SDK for Java to connect to S3, so you can use a mocking library (e.g. fake-s3) to emulate S3.

Supported Redshift Commands

  • COPY
    • JSON with jsonpaths
    • MANIFEST
  • UNLOAD
  • DDLs
    • Redshift-specific directives (e.g. DISTSTYLE, DISTKEY, ...) are simply dropped (see the sketch below)

Some options are currently not supported.
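
For example, a CREATE TABLE that carries Redshift-only directives can be executed through the driver unchanged. A minimal sketch (the users table is hypothetical, and stmt is a java.sql.Statement obtained as in the usage example below):

// DISTSTYLE / DISTKEY / SORTKEY are dropped by the driver; the remaining DDL
// is passed to the underlying postgresql / h2 database as-is.
val ddl =
  """create table users (
    |  id int,
    |  name varchar(64)
    |)
    |diststyle key
    |distkey (id)
    |sortkey (id)""".stripMargin
stmt.execute(ddl)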

Contributions are welcome!

Requirements

  • S3 (fake-s3 is OK)
  • aws-java-sdk-s3, aws-java-sdk-sts >= 1.10.8
  • postgresql or h2 JDBC driver
  • Java >= 1.7
  • (For Scala projects) Scala 2.10.x / 2.11.x / 2.12.x (Java 1.8 for 2.12.x)
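
A minimal build.sbt sketch of the dependencies listed above (artifact coordinates and versions are assumptions; adjust them to the releases you actually use):

// build.sbt -- illustrative coordinates and versions only
libraryDependencies ++= Seq(
  "jp.ne.opt"      %% "redshift-fake-driver" % "1.0.13",   // assumed artifact coordinates/version
  "com.amazonaws"  %  "aws-java-sdk-s3"      % "1.11.200",
  "com.amazonaws"  %  "aws-java-sdk-sts"     % "1.11.200",
  "org.postgresql" %  "postgresql"           % "42.2.5"    // or "com.h2database" % "h2"
)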

Usage (with PostgreSQL)

Set up PostgreSQL

  • Start a PostgreSQL database to mock Redshift.

Write an application

Example in Scala

import java.sql.DriverManager
import java.util.Properties

Class.forName("jp.ne.opt.redshiftfake.postgres.FakePostgresqlDriver")

val endpoint = "http://localhost:9444/" // in production, the scheme will be "s3://"

val url = "jdbc:postgresqlredshift://localhost:5432/foo"
val prop = new Properties()
prop.setProperty("driver", "jp.ne.opt.redshiftfake.postgres.FakePostgresqlDriver")
prop.setProperty("user", "sa")

val conn = DriverManager.getConnection(url, prop)
val stmt = conn.createStatement()

// UNLOAD the contents of table foo to (fake) S3, writing a manifest.
val unloadSql =
  s"""unload ('select * from foo') to '${endpoint}foo/unloaded_'
      |credentials 'aws_access_key_id=AKIAXXXXXXXXXXXXXXX;aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY'
      |manifest""".stripMargin
stmt.execute(unloadSql)

// COPY the unloaded files back into table bar via the manifest.
val copySql =
  s"""copy bar from '${endpoint}foo/unloaded_manifest'
      |credentials 'aws_access_key_id=AKIAXXXXXXXXXXXXXXX;aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY'
      |manifest""".stripMargin

stmt.execute(copySql)
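
COPY from JSON with a jsonpaths file is also accepted. A hedged sketch continuing the example above (the target table baz, the JSON objects under foo/events_, and the jsonpaths file location are all assumptions):

// COPY JSON data, mapping fields via a jsonpaths file that also lives on (fake) S3.
val copyJsonSql =
  s"""copy baz from '${endpoint}foo/events_'
      |credentials 'aws_access_key_id=AKIAXXXXXXXXXXXXXXX;aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY'
      |json '${endpoint}foo/jsonpaths.json'""".stripMargin
stmt.execute(copyJsonSql)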

Run the application.

With real S3

  • Specify the region via the fake.awsRegion system property (see the sketch below).
    • us-east-1 by default
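
The property can be passed as a JVM flag (e.g. -Dfake.awsRegion=ap-northeast-1) or set programmatically before the first connection is opened; a small sketch (the region value is only an example):

// must be set before the driver builds its S3 client; ap-northeast-1 is just an example
System.setProperty("fake.awsRegion", "ap-northeast-1")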

With fake-s3

  • Specify the S3 endpoint via the fake.awsS3Endpoint system property (see the sketch below).
    • if you started a fake-s3 server on http://localhost:9444/, specify -Dfake.awsS3Endpoint="http://localhost:9444/". (the trailing slash is required)
    • you do not have to change the S3 URLs in your SQL (copy/unload).
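
The same property can also be set programmatically; a small sketch (the endpoint matches the fake-s3 server assumed above):

// the trailing slash is required
System.setProperty("fake.awsS3Endpoint", "http://localhost:9444/")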

License

Apache 2.0

Original Project

The original project from Opt, Inc. / Opt Technologies can be found here.
