What is remote viewing in Oracle?
In any case, if you can extract the data from your table into a CSV file, you have one more scripting option: you can use the Python/boto/psycopg2 combination to script the CSV load into Amazon Redshift.
In my MySQL_To_Redshift_Loader, I do the following:
1. Extract the data from MySQL into a temp file.
    from subprocess import Popen, PIPE

    # mysql client invocation, built from the script's options.
    loadConf = [db_client_dbshell, '-u', opt.mysql_user, '-p%s' % opt.mysql_pwd,
                '-D', opt.mysql_db_name, '-h', opt.mysql_db_server]
    ...
    # SELECT ... INTO OUTFILE query that dumps the table as delimited text.
    q = """
    %s %s
    INTO OUTFILE '%s'
    FIELDS TERMINATED BY '%s'
    ENCLOSED BY '%s'
    LINES TERMINATED BY '\r\n';
    """ % (in_qry, limit, out_file, opt.mysql_col_delim, opt.mysql_quote)
    # Pipe the query text into the mysql client on stdin.
    p1 = Popen(['echo', q], stdout=PIPE, stderr=PIPE, env=env)
    p2 = Popen(loadConf, stdin=p1.stdout, stdout=PIPE, stderr=PIPE)
    ...
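As a point of reference, a self-contained version of this step might look like the sketch below; the host, credentials, table name, and output path are all made-up placeholders, and it assumes the mysql command-line client is on the PATH. Note that INTO OUTFILE writes the file on the MySQL server host, not the client.

    from subprocess import Popen, PIPE

    # Placeholder connection details -- not from the original loader.
    mysql_cmd = ['mysql', '-u', 'loader', '-psecret', '-D', 'sales',
                 '-h', 'db.example.com']
    query = """
    SELECT * FROM orders
    INTO OUTFILE '/tmp/orders.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n';
    """
    # Feed the query to the client on stdin, as the loader's echo pipe does.
    p = Popen(mysql_cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    out, err = p.communicate(query.encode())
    if p.returncode != 0:
        raise RuntimeError(err.decode())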
2. Compress the data and load it to S3 using the boto Python module and a multipart upload.
    import boto
    from boto.s3.key import Key

    # Upload the file to S3; progress is a callback invoked as bytes are sent.
    conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    bucket = conn.get_bucket(bucket_name)
    k = Key(bucket)
    k.key = s3_key_name
    k.set_contents_from_file(file_handle, cb=progress, num_cb=20,
                             reduced_redundancy=use_rr)
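The call above is the simple single-request upload; since this step mentions compression and multipart loading, here is a hedged sketch of that variant using the same (now legacy) boto API. The bucket name and file paths are assumptions, and S3 requires every part except the last to be at least 5 MB.

    import gzip
    import shutil
    from io import BytesIO

    import boto

    # Gzip the CSV first; Redshift's COPY can read gzipped input directly.
    with open('/tmp/orders.csv', 'rb') as src, \
            gzip.open('/tmp/orders.csv.gz', 'wb') as dst:
        shutil.copyfileobj(src, dst)

    conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    bucket = conn.get_bucket('my-redshift-staging')  # hypothetical bucket

    # Multipart upload: send the file in 5 MB parts (S3's minimum part size).
    mp = bucket.initiate_multipart_upload('orders.csv.gz')
    with open('/tmp/orders.csv.gz', 'rb') as f:
        part_num = 0
        while True:
            chunk = f.read(5 * 1024 * 1024)
            if not chunk:
                break
            part_num += 1
            mp.upload_part_from_file(BytesIO(chunk), part_num)
    mp.complete_upload()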
3. Use psycopg2 to issue a COPY command that appends the data to the Redshift table.
    # Build the COPY statement; quote, gzip, timeformat, and ignoreheader hold
    # optional clauses assembled elsewhere in the script.
    sql = """
    copy %s from '%s'
    CREDENTIALS 'aws_access_key_id=%s;aws_secret_access_key=%s'
    DELIMITER '%s'
    FORMAT CSV %s %s %s %s;
    """ % (opt.to_table, fn, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,
           opt.delim, quote, gzip, timeformat, ignoreheader)
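For completeness, a minimal sketch of executing that COPY through psycopg2 could look like this; the cluster endpoint, database, and table names are hypothetical. The key detail is that COPY runs inside a transaction like any other statement, so it must be committed.

    import psycopg2

    # Hypothetical Redshift endpoint and credentials.
    conn = psycopg2.connect(
        host='mycluster.abc123.us-east-1.redshift.amazonaws.com',
        port=5439, dbname='analytics', user='loader', password='secret')
    cur = conn.cursor()
    cur.execute("""
        copy orders from 's3://my-redshift-staging/orders.csv.gz'
        CREDENTIALS 'aws_access_key_id=%s;aws_secret_access_key=%s'
        DELIMITER ',' FORMAT CSV GZIP IGNOREHEADER 1;
    """ % (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY))
    conn.commit()  # make the loaded rows visible
    cur.close()
    conn.close()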