SDE to CSV slow transfer speeds
I'm trying to export a feature class made up of polylines from our network SDE to a CSV file on the local machine (or to a file geodatabase, but slow speed seems to be the theme regardless of output format):
import arcpy, csv
import datetime
from arcpy import env

def tableToCSV(input_tbl, csv_filepath):
    fld_list = arcpy.ListFields(input_tbl)
    fld_names = [fld.name for fld in fld_list]
    with open(csv_filepath, 'wb') as csv_file:
        writer = csv.writer(csv_file)
        writer.writerow(fld_names)
        with arcpy.da.SearchCursor(input_tbl, fld_names) as cursor:
            for row in cursor:
                writer.writerow(row)
        print csv_filepath + " CREATED"

env.workspace = r"Q:\SDE\Direct Connection to delivery.sde"
desiredFC = arcpy.ListFeatureClasses("", "", "Topography")
for fc in desiredFC:
    if fc == 'Contours':
        print 'Starting to Process ' + fc
        startTime = datetime.datetime.now()
        out_csv = r"C:\Users\CHOK\Desktop\Extract Data Method/" + fc[14:] + '.csv'
        tableToCSV(fc, out_csv)
        endTime = datetime.datetime.now()
        print 'Finished Processing ' + fc
        print 'It took ' + str(endTime - startTime) + ' to process ' + fc

raw_input('Press Enter to exit')
When I run the above script, transferring 3.6 GB of data from the SDE to a CSV (or to a file geodatabase using arcpy copy methods) takes 3 to 4 hours.
I can see in Task Manager that the network speed utilised is usually between 0.3 Mbps and 2 Mbps, but sometimes it jumps to 10~20 Mbps, and other feature classes transfer at around 5~7 Mbps.
Is there something I can try to improve the speed at which SDE data stored on the network is copied over to a local drive, or is this a lost cause given the network currently in place?
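One diagnostic worth running before blaming the network: time the cursor fetches and the CSV writes separately, so you can see which side dominates. This is a minimal sketch, not part of the original script; it is written in Python 3 syntax, and the plain list below is a stand-in for an arcpy.da.SearchCursor (any iterable of rows works):

```python
import csv
import time

def timed_export(rows, csv_filepath, header):
    """Export rows to CSV, timing fetch and write separately."""
    fetch_s = write_s = 0.0
    it = iter(rows)
    with open(csv_filepath, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(header)
        while True:
            t0 = time.time()
            try:
                row = next(it)  # fetch: cursor/network cost lands here
            except StopIteration:
                break
            fetch_s += time.time() - t0
            t0 = time.time()
            writer.writerow(row)  # write: local disk cost lands here
            write_s += time.time() - t0
    return fetch_s, write_s

# Stand-in data; with arcpy this would be the SearchCursor itself.
fetch_s, write_s = timed_export([(1, 'a'), (2, 'b')], 'timing_demo.csv',
                                ['id', 'name'])
```

If fetch_s dwarfs write_s, the bottleneck is on the cursor/network side, and changing the output format won't help.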
arcpy enterprise-geodatabase
That's probably because your HDD is thrashing writing to CSV from a feature class - open your Resource Monitor to see how hard your drive is being pushed. I would think a file geodatabase would be better to write to; a shapefile is definitely not an option due to size. You could perhaps create a replica. What is the underlying database for your SDE? What else is running on that server? When was it last restarted? Have you tried running the export on the server instead of on a remote workstation, to negate the network speed?
– Michael Stimson
Apr 3 at 5:23
The disk writing is at 1%, so I don't think it's a hard drive issue. Attempts to write to a new file GDB also seem to yield similar speeds. Unfortunately I don't have authority over the actual SDE (it's located in a different location and managed by a different team).
– Andy
Apr 3 at 5:31
That often happens. Have you tried creating a replica (resources.arcgis.com/en/help/main/10.2/index.html#//…)? It should be optimized for this sort of thing, but it may also create a version, which you may not have permission to do. Otherwise Copy Features (resources.arcgis.com/en/help/main/10.2/index.html#//…) or Feature Class to Feature Class (resources.arcgis.com/en/help/main/10.2/index.html#//…) should be faster. If not, then it's a server/network issue - contact the team that administers the database and see what's up.
– Michael Stimson
Apr 3 at 5:37
A few years back I needed to use ASCII to export and import several hundred million rows. I used arcpy.da.SearchCursor and csv, and the export to 68 GB took 20 minutes and the import two hours. 4 GB is a drop in that bucket and should only take minutes, which makes me wonder if you're using an IO-limited resource like an RDS instance for the Enterprise geodatabase (there is no longer anything called SDE).
– Vince
Apr 3 at 11:21
I think this is an XY Problem, because the root issue is poor performance of the data itself, not the access methodology. If you intersect your contours with a coarse fishnet, then Dissolve on the attribute used for rendering, the feature count will be reduced and the spatial index more effective. The fact that the data is in a remote data center may be a contributing factor, as may fragmentation and an uncompressed version tree, but for contours you have an additional issue.
– Vince
Apr 3 at 11:54
asked Apr 3 at 5:17 by Andy (new contributor), edited Apr 3 at 11:19 by Vince
1 Answer
I believe this is happening because you are using cursors for the data-fetching operation; I encountered the same behavior in the past. I would suggest using a plain SQL statement for the query, via ArcSDESQLExecute:
ArcSDESQLExecute.execute(sql_statement)
Once you get the result set, you can iterate through it and save it to CSV. Since the result set comes back in a plain format, you can perform the write operation with much faster speed.
import arcpy

# Use a connection file to create the connection
db = r'Database Connections\SDEConnection_pointing_to_your_version.sde'
egdb_conn = arcpy.ArcSDESQLExecute(db)

table_name = 'YourFeatureClassName'
field_name = 'FieldName'

# Don't select the Shape field until it's necessary, as Shape is stored
# in a BLOB field, which takes time to process.
sql = '''
SELECT * FROM {0}
'''.format(table_name)

egdb_return = egdb_conn.execute(sql)
for i in egdb_return:
    print('{0}: {1}'.format(*i))
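To finish the round trip the answer describes (iterate through the result set and save it to CSV), something like the following works. The field names and rows below are hypothetical stand-ins for what egdb_conn.execute would return, shown in Python 3 syntax:

```python
import csv

def resultset_to_csv(rows, field_names, csv_filepath):
    """Write a list-of-row-lists result set to a CSV file."""
    with open(csv_filepath, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(field_names)
        writer.writerows(rows)  # one buffered call instead of one per row

# Hypothetical result set; ArcSDESQLExecute.execute returns a list of
# row lists for multi-row results.
fake_result = [[1, 'Contour A'], [2, 'Contour B']]
resultset_to_csv(fake_result, ['OBJECTID', 'NAME'], 'contours.csv')
```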
A simple database cursor is not going to be version-aware, so your answer should mention this. The Python code is pretty basic, so giving a C# example seems confusing.
– Vince
Apr 3 at 11:16
Agree, I will update my answer.
– Bhaskar Singh
Apr 3 at 12:10
Thanks for the help. I specified my db and have table_name = 'Topography\Contour_Other' and field_name = 'load_date'. When I run the rest of the code, the sql is shown as '\nSELECT * FROM Topography\Contour_Other\n', and egdb_return = egdb_conn.execute(sql) returns this error: AttributeError: ArcSDESQLExecute: StreamPrepareSQL ArcSDE Extended error -202 [Informix][Informix ODBC Driver][Informix] An illegal character has been found in the statement.
– Andy
Apr 3 at 21:59
Are you storing your user name and password in the SDE file along with the version information? If not, you must specify the username and password explicitly: ArcSDESQLExecute(server, instance, database, user, password). Also, you must be setting env.workspace; after that you would execute the SQL statement against your table/feature class. In that case you need to pass the feature class or table name only, not the full path.
– Bhaskar Singh
2 days ago
answered Apr 3 at 5:44 by Bhaskar Singh (edited Apr 3 at 12:33)
A simple database cursor is not going to be version-aware, so your answer should mention this. The Python code is pretty basic, so giving a C# example seems confusing.
– Vince
Apr 3 at 11:16
Agree, I will update my answer.
– Bhaskar Singh
Apr 3 at 12:10
Thanks for the help. I specified my db and have table_name = 'TopographyContour_Other' and field_name = 'load_date'. When I run the rest of the code the sql is shown as '\nSELECT * FROM TopographyContour_Other\n' and egdb_return = egdb_conn.execute(sql) returns this error: AttributeError: ArcSDESQLExecute: StreamPrepareSQL ArcSDE Extended error -202 [Informix][Informix ODBC Driver][Informix]An illegal character has been found in the statement.
– Andy
Apr 3 at 21:59
Are you storing your user name and password in the SDE file along with the version information? If not, you must specify the username and password explicitly: ArcSDESQLExecute(server, instance, database, user, password). You should also set env.workspace; after that, when executing the SQL statement against your table/feature class, you need to pass the feature class or table name only, not the full path.
– Bhaskar Singh
2 days ago
Andy is a new contributor. Be nice, and check out our Code of Conduct.
That's probably because your HDD is thrashing while writing to CSV from a feature class – open your resource monitor to see how hard your drive is being pushed. I would think a file geodatabase would be better to write to; a shapefile is definitely not an option due to size. You could perhaps create a replica. What is the underlying database for your SDE? What else is running on that server? When was it last restarted? Have you tried running the export on the server instead of on a remote workstation, to negate the network speed?
– Michael Stimson
Apr 3 at 5:23
The disk writing is at 1% so I don't think it's a hard drive issue. Attempts to write to a new file gdb also seem to yield similar speeds. Unfortunately I don't have authority over the actual SDE (it is located in a different location and managed by a different team).
– Andy
Apr 3 at 5:31
That often happens. Have you tried to create a replica (resources.arcgis.com/en/help/main/10.2/index.html#//…)? It should be optimized for this sort of thing, but it may also create a version, which you may not have permission to do. Otherwise Copy Features (resources.arcgis.com/en/help/main/10.2/index.html#//…) or Feature Class to Feature Class (resources.arcgis.com/en/help/main/10.2/index.html#//…) should be faster. If not, then it's a server/network issue – contact the team that administers the database and see what's up.
– Michael Stimson
Apr 3 at 5:37
A few years back I needed to use ASCII to export and import several hundred million rows. I used arcpy.da.SearchCursor and csv, and the export to 68Gb took 20 minutes and the import two hours. 4Gb is a drop in that bucket and should only take minutes, which makes me wonder if you're using an IO-limited resource like an RDS instance for the Enterprise geodatabase (there is no longer anything called SDE).
– Vince
Apr 3 at 11:21
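Vince's SearchCursor-plus-csv approach streams rows straight to disk instead of materializing them in memory. A sketch of that pattern – `rows_to_csv` is a hypothetical helper name, and with arcpy the `rows` iterable would be `arcpy.da.SearchCursor(feature_class, fieldnames)`; here any iterable of tuples stands in:

```python
import csv

def rows_to_csv(rows, fieldnames, path):
    # Stream rows to CSV one at a time; memory use stays flat no matter
    # how many rows the source yields. With arcpy, `rows` would be
    # arcpy.da.SearchCursor(feature_class, fieldnames).
    count = 0
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(fieldnames)
        for row in rows:
            writer.writerow(row)
            count += 1
    return count

# Usage with an in-memory stand-in for a cursor:
n = rows_to_csv([(1, 'a'), (2, 'b')], ['OID', 'NAME'], 'out.csv')
```

The design point is that nothing is buffered beyond one row, so export time is bounded by I/O throughput rather than Python memory pressure.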
I think this is an XY Problem, because the root issue is poor performance of the data itself, not the access methodology. If you intersect your contours with a coarse fishnet, then Dissolve (or dissolve on the attribute used for rendering), the feature count will be reduced and the spatial index more effective. The fact that the data is in a remote data center may be a contributing factor, as may fragmentation and an uncompressed version tree, but for contours you have an additional issue.
– Vince
Apr 3 at 11:54
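The fishnet idea above amounts to tiling the data's extent into a grid and cutting long features against it so each resulting piece stays small. In ArcGIS that would be Create Fishnet followed by Intersect and Dissolve; the grid-cell arithmetic alone can be sketched in plain Python (function name and parameters are illustrative, not part of any API):

```python
def fishnet_cells(xmin, ymin, xmax, ymax, nx, ny):
    # Yield the (xmin, ymin, xmax, ymax) envelope of every cell in an
    # nx-by-ny grid covering the extent. Intersecting long contour lines
    # with these cells shortens each feature, which keeps the spatial
    # index selective.
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    for i in range(nx):
        for j in range(ny):
            yield (xmin + i * dx, ymin + j * dy,
                   xmin + (i + 1) * dx, ymin + (j + 1) * dy)

cells = list(fishnet_cells(0.0, 0.0, 100.0, 100.0, 4, 4))
```

A contour that spans the whole extent would be split into up to 16 shorter features here, each with a much tighter bounding box for the index to work with.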