I am using the pyKML module to extract coordinates from a given KML file.
My Python code is as follows:
    from pykml import parser

    fileobject = parser.fromstring(open('MapSource.kml', 'r').read())
    root = parser.parse(fileobject).getroot()
    print(xml.Document.Placemark.Point.coordinates)
However, when I run this code, the following error occurs:
    ValueError: Unicode strings with encoding declaration are not supported. Please use bytes input or XML fragments without declaration.
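From the message, my understanding is that lxml refuses a unicode (str) input that still carries an <?xml ... encoding="..."?> declaration at the top of the file. A minimal sketch of what I think should work instead, assuming pykml's parser.parse accepts a binary file object (it delegates to lxml under the hood):

    from pykml import parser

    # Opening in binary mode hands lxml bytes, so the encoding
    # declaration in the KML header should no longer be a problem.
    with open('MapSource.kml', 'rb') as f:
        root = parser.parse(f).getroot()

    # Assumes the usual Document > Placemark > Point layout.
    print(root.Document.Placemark.Point.coordinates)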
Looking for solutions, I came across this post: http://twigstechtips.blogspot.in/2013/06/python-lxml-strings-with-encoding.html and tried the following (which I'm not sure is the correct approach):
    from pykml import parser
    from lxml import etree
    from os import path

    kml_file = open('MapSource.kml', 'r')
    parser = etree.XMLParser(recover=True)
    xml = etree.fromstring(kml_file, parser)
    print(xml.Document.Placemark.Point.coordinates)
This gives me "ValueError: can only parse strings". What is the right way to parse the KML and get the coordinates in this structure?
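For what it's worth, I suspect the second error comes from passing the file object itself to etree.fromstring(), which wants a string or bytes, whereas etree.parse() takes a file object directly. A sketch of what I mean with plain lxml (note that without pykml's objectify layer, attribute-style access like xml.Document.Placemark doesn't work, so I'm assuming the standard KML 2.2 namespace and looking elements up by tag instead):

    from lxml import etree

    # etree.parse() accepts a file object; fromstring() would
    # need kml_file.read() instead of the file object itself.
    with open('MapSource.kml', 'rb') as f:
        tree = etree.parse(f)
    root = tree.getroot()

    # Plain etree elements are addressed by namespaced tag name.
    ns = {'kml': 'http://www.opengis.net/kml/2.2'}
    for coords in root.findall('.//kml:Placemark/kml:Point/kml:coordinates', namespaces=ns):
        print(coords.text)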