I am trying to create a Python script that does the following:
- Parse a CSV file
- Send the parsed data to a remote server via its REST API
The code I have successfully parses the CSV file and converts it to a JSON array of row objects.
However, when I import it to the remote server, only the first row is written.
Question 1: Do I need to iterate and send a SEPARATE HTTP request for EACH ROW? That feels too cumbersome; my CSV file has over 10,000 lines and the script will run as a daily cron job.
Question 2: How can I configure this to import all rows in a single request?
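For reference, this is the kind of per-row loop I am hoping to avoid (question 1 above). It is only a sketch: the URL, credentials, and table name are placeholders, and it assumes the endpoint accepts exactly one record per POST.

    import csv
    import json
    import requests

    # Per-row approach (sketch only). Placeholder URL/credentials/table name.
    url = 'https://myinstance.service-now.com/api/now/table/u_my_import_table'
    user = 'xxxxxx'
    pwd = 'xxxxxx'
    headers = {"Content-Type": "application/json", "Accept": "application/json"}

    session = requests.Session()          # reuse one connection for all 10,000+ requests
    session.auth = (user, pwd)
    session.headers.update(headers)

    with open('example_import_csv.csv', 'r', newline='') as f:
        reader = csv.DictReader(f, fieldnames=("u_date", "u_product", "u_serial_number"))
        for row in reader:
            # one HTTP POST per CSV row -- this is what I would like to avoid
            response = session.post(url, data=json.dumps(row))
            if response.status_code != 201:
                print('Status:', response.status_code, 'Headers:', response.headers)
                break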
    # Requests package for Python
    import requests
    import csv
    import json

    # Parse CSV file and convert to JSON
    f = open('example_import_csv.csv', 'rU')
    reader = csv.DictReader(f, fieldnames=("u_date", "u_product", "u_serial_number"))
    out = json.dumps([row for row in reader])

    # Print output confirms that the JSON is formatted properly
    print(">JSON", out)

    # Set request parameters
    url = 'xxxxx'
    user = 'xxxxxx'
    pwd = 'xxxxxx'

    # Set proper headers
    headers = {"Content-Type": "application/json", "Accept": "application/json"}

    # data=out contains the JSON object
    # Problem is only the first row is imported
    response = requests.post(url, auth=(user, pwd), headers=headers, data=out)

    # Check for HTTP codes other than 200
    if response.status_code != 200:
        print('Status:', response.status_code, 'Headers:', response.headers)
        exit()

    # Decode the JSON response into a dictionary and use the data
    print('Status:', response.status_code, 'Headers:', response.headers, 'Response:', response.json())

    ### OUTPUT
    # >JSON [{"u_serial_number": "11", "u_product": "Apples", "u_date": "1/12/15"}, {"u_serial_number": "12", "u_product": "Pears", "u_date": "1/29/15"}, {"u_serial_number": "13", "u_product": "Oranges", "u_date": "1/12/15"}, {"u_serial_number": "14", "u_product": "Blackberries", "u_date": "1/29/15"}, {"u_serial_number": "15", "u_product": "Blueberries", "u_date": "2/5/15"}, {"u_serial_number": "16", "u_product": "Bananas", "u_date": "2/7/15"}, {"u_serial_number": "17", "u_product": "Strawberries", "u_date": "2/7/15"}]
    # Status: 201 Headers: {'Content-Encoding': 'gzip', 'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'}
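And this is roughly how I imagined batching the rows for question 2. It is a sketch only: it assumes (and this is exactly what I am unsure about) that the endpoint will accept a JSON array of records in a single POST body, and the chunk size is arbitrary.

    import csv
    import json
    import requests

    # Batch approach (sketch only). ASSUMES the endpoint accepts a JSON array per POST.
    url = 'xxxxx'
    user = 'xxxxxx'
    pwd = 'xxxxxx'
    headers = {"Content-Type": "application/json", "Accept": "application/json"}

    def chunks(items, size):
        """Yield successive lists of at most `size` items."""
        for i in range(0, len(items), size):
            yield items[i:i + size]

    with open('example_import_csv.csv', 'r', newline='') as f:
        reader = csv.DictReader(f, fieldnames=("u_date", "u_product", "u_serial_number"))
        rows = list(reader)

    for batch in chunks(rows, 1000):       # e.g. 1,000 records per request
        response = requests.post(url, auth=(user, pwd), headers=headers, data=json.dumps(batch))
        if response.status_code not in (200, 201):
            print('Status:', response.status_code, 'Headers:', response.headers)
            break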
EDIT: I am sending the data to our ServiceNow instance. Here is a wiki article that describes a Python script template:
http://wiki.servicenow.com/index.php?title=Table_API_Python_Examples#gsc.tab=0
Here is the base block of code that I used as a template. Note that it works for a single row of data, as in the example, but does not work when importing multiple rows.
    # Need to install requests package for python
    # sudo easy_install requests
    import requests

    # Set the request parameters
    url = 'https://myinstance.service-now.com/api/now/table/incident'
    user = 'xxxxxxx'
    pwd = 'xxxxxxx'

    # Set proper headers
    headers = {"Content-Type": "application/json", "Accept": "application/json"}

    # Do the HTTP request
    response = requests.post(url, auth=(user, pwd), headers=headers, data='{"short_description":"Test"}')

    # Check for HTTP codes other than 201
    if response.status_code != 201:
        print('Status:', response.status_code, 'Headers:', response.headers, 'Error Response:', response.json())
        exit()
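To make the difference concrete, the only thing that changes between the working template and my script is the body passed to requests.post (the sample values below are taken from my output above):

    import json

    # Body the template example posts -- a single JSON object (this works):
    single_record = '{"short_description":"Test"}'

    # Body my script posts -- a JSON array built from the CSV rows
    # (these are the first two rows from my sample output):
    multiple_records = json.dumps([
        {"u_date": "1/12/15", "u_product": "Apples", "u_serial_number": "11"},
        {"u_date": "1/29/15", "u_product": "Pears", "u_serial_number": "12"},
    ])
    # Passing multiple_records as data= is when only the first row gets written.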