[SI-1091] add entry for address import in architectureDocumentation.adoc

Signed-off-by: Holger Rudolph <holger.rudolph@pta.de>
diff --git a/src/main/asciidoc/architectureDocumentation/architectureDocumentation.adoc b/src/main/asciidoc/architectureDocumentation/architectureDocumentation.adoc
index ae317bf..a146f71 100644
--- a/src/main/asciidoc/architectureDocumentation/architectureDocumentation.adoc
+++ b/src/main/asciidoc/architectureDocumentation/architectureDocumentation.adoc
@@ -747,7 +747,7 @@
 to setup the RabbitMQ configuration correctly.
 
 A client, that wants to use the message queue to import data, has to use
-the correct queue/channel configuration.In addition, the following values ​​must be entered as message headers:
+the correct queue/channel configuration. In addition, the following values must be entered as message headers:
 * "*metaId*" Unique id out of the foreign system.
 For each metaId from an external system
 only one failure information object is ever created. If an existing metaId is sent again,
@@ -805,6 +805,82 @@
 Please refer to the subproject "*test Import Grid Failures*" of the backend repository, for an example for
 sending Data over the message queue.
 
+==== Import of addresses from CSV files
+Addresses can be imported from CSV files with the addressImport service.
+The import of general addresses as well as addresses for power, gas and water connections is supported; addresses of power stations are imported as well.
+UTM coordinates from the source files are converted into latitude and longitude.
+Addresses with the same id are stored in a single data row.
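The UTM-to-geographic conversion mentioned above can be sketched as follows. This is a minimal Python sketch using the standard inverse transverse Mercator series for the WGS84 ellipsoid; the actual implementation inside the addressImport service may differ:

```python
import math

def utm_to_latlon(easting, northing, zone_number, northern=True):
    """Convert UTM (WGS84) coordinates to latitude/longitude in degrees.

    Standard inverse transverse Mercator series expansion.
    """
    a = 6378137.0                      # WGS84 semi-major axis
    f = 1 / 298.257223563              # WGS84 flattening
    k0 = 0.9996                        # UTM scale factor
    e2 = f * (2 - f)                   # first eccentricity squared
    ep2 = e2 / (1 - e2)                # second eccentricity squared

    x = easting - 500000.0             # remove the false easting
    y = northing if northern else northing - 10000000.0

    # Footpoint latitude from the meridian arc length
    m = y / k0
    mu = m / (a * (1 - e2 / 4 - 3 * e2**2 / 64 - 5 * e2**3 / 256))
    e1 = (1 - math.sqrt(1 - e2)) / (1 + math.sqrt(1 - e2))
    phi1 = (mu
            + (3 * e1 / 2 - 27 * e1**3 / 32) * math.sin(2 * mu)
            + (21 * e1**2 / 16 - 55 * e1**4 / 32) * math.sin(4 * mu)
            + (151 * e1**3 / 96) * math.sin(6 * mu))

    c1 = ep2 * math.cos(phi1) ** 2
    t1 = math.tan(phi1) ** 2
    n1 = a / math.sqrt(1 - e2 * math.sin(phi1) ** 2)
    r1 = a * (1 - e2) / (1 - e2 * math.sin(phi1) ** 2) ** 1.5
    d = x / (n1 * k0)

    lat = phi1 - (n1 * math.tan(phi1) / r1) * (
        d**2 / 2
        - (5 + 3 * t1 + 10 * c1 - 4 * c1**2 - 9 * ep2) * d**4 / 24
        + (61 + 90 * t1 + 298 * c1 + 45 * t1**2 - 252 * ep2
           - 3 * c1**2) * d**6 / 720)
    lon0 = math.radians((zone_number - 1) * 6 - 180 + 3)  # central meridian
    lon = lon0 + (d
                  - (1 + 2 * t1 + c1) * d**3 / 6
                  + (5 - 2 * c1 + 28 * t1 - 3 * c1**2 + 8 * ep2
                     + 24 * t1**2) * d**5 / 120) / math.cos(phi1)
    return math.degrees(lat), math.degrees(lon)

# A point on the central meridian of zone 32 (9° E), roughly northern Germany
lat, lon = utm_to_latlon(500000.0, 5760000.0, 32)
```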
+
+The location and file names can be configured in the *.yaml files, which are located in the JAR file of the addressImport service:
+
+[#configuration-section-services]
+_Address Import configuration_
+
+* *adressimport.cleanup* If `enabled=true`, the import service deletes all data from the address table before the import.
+* *adressimport.cron* Cron expression that schedules the import runs.
+* *adressimport.file.addresses* directory and name of the import file for general addresses
+* *adressimport.file.power-connections* directory and name of the import file for addresses with power connections
+* *adressimport.file.gas-connections* directory and name of the import file for addresses with gas connections
+* *adressimport.file.water-connections* directory and name of the import file for addresses with water connections
+* *adressimport.file.power-stations* directory and name of the import file for power station addresses
+
+* *utm.zoneNumber* Number of UTM zone
+* *utm.zoneLetter* Letter of UTM zone
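Such a configuration might look like the following fragment. The file paths, the cron expression, and the exact nesting of the `cleanup.enabled` key are hypothetical and must match the actual deployment:

```yaml
# Hypothetical addressImport configuration (illustrative values only)
adressimport:
  cleanup:
    enabled: true            # wipe the address table before each import
  cron: "0 0 3 * * *"        # assumed schedule: daily at 03:00
  file:
    addresses: /import/addresses.csv
    power-connections: /import/power-connections.csv
    gas-connections: /import/gas-connections.csv
    water-connections: /import/water-connections.csv
    power-stations: /import/power-stations.csv

utm:
  zoneNumber: 32
  zoneLetter: U
```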
+
+===== Source file requirements
+
+The CSV files must contain the following columns in the given order:
+
+====== General addresses
+. UTM easting coordinate
+. UTM northing coordinate
+. address id
+. postcode
+. community
+. district
+. street
+. house number
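A general-address row in this layout could be read like this. The sample line and the semicolon delimiter are assumptions for illustration and must match the actual export format:

```python
import csv
import io

# Hypothetical sample line in the general-address column layout described
# above; the semicolon delimiter is an assumption about the export format.
sample = "566414;5931371;A-1001;20095;Hamburg;Mitte;Beispielstrasse;12a"

fields = ["easting", "northing", "address_id", "postcode",
          "community", "district", "street", "house_number"]

reader = csv.reader(io.StringIO(sample), delimiter=";")
row = dict(zip(fields, next(reader)))
```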
+
+====== Addresses with power connections
+. UTM easting coordinate
+. UTM northing coordinate
+. address id
+. postcode
+. community
+. district
+. street
+. house number
+. power station number
+
+====== Addresses with gas connections
+. UTM easting coordinate
+. UTM northing coordinate
+. address id
+. postcode
+. community
+. district
+. street
+. house number
+. group
+
+====== Addresses with water connections
+. UTM easting coordinate
+. UTM northing coordinate
+. address id
+. postcode
+. community
+. district
+. street
+. house number
+. group
+
+====== Addresses for power stations
+. UTM easting coordinate
+. UTM northing coordinate
+. address id
+. power station number
+. power station name
+
 === Deployment of the application components
 
 ==== Deployment of the frontend