We are using spark-solr to write CSV data into a Solr collection.

Sometimes the CSV contains additional fields that we do not want to index into the Solr collection, so we made the Solr schema immutable:

<schemaFactory class="ManagedIndexSchemaFactory">
  <bool name="mutable">false</bool>
  <str name="managedSchemaResourceName">managed-schema</str>
</schemaFactory>

This works fine as long as every CSV column is defined (manually) in Solr. The moment the CSV contains an unexpected field, spark-solr tries to create it and the job fails, because there is no check for whether the schema is immutable. As I understand it, the CSV data (all columns) is loaded as-is with:

scala> var csvDF = spark.read.format("com.databricks.spark.csv").option("header", "true").load(csvFileLocation)
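For context, the write path looks roughly like the sketch below. Until a flag exists, a possible workaround is to select only the schema-defined columns explicitly before the write. This is only a rough sketch: the field names, zkhost, and collection name are placeholders, not from our actual setup.

```scala
import org.apache.spark.sql.functions.col

// Columns that are actually defined in the immutable Solr schema
// (placeholder names -- in practice this list would come from the schema itself).
val definedFields = Seq("id", "name_s", "price_f")

// Drop any unexpected CSV columns so spark-solr never attempts to add new fields.
val safeDF = csvDF.select(definedFields.map(col): _*)

safeDF.write
  .format("solr")
  .option("zkhost", "localhost:9983")   // placeholder ZooKeeper connect string
  .option("collection", "mycollection") // placeholder collection name
  .save()
```

Maintaining that column list by hand for every CSV is exactly the friction we would like to avoid, which motivates the request below.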
Feature request: a config flag in spark-solr indicating that the collection schema is immutable, so that unexpected fields are not added to the schema.
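To make the request concrete, the flag could be passed like any other spark-solr write option. The option name below is purely hypothetical and does not exist in spark-solr today; it is only meant to illustrate the idea.

```scala
// "schema_immutable" is a hypothetical option name, for illustration only.
// When set, spark-solr would skip (or fail fast on) DataFrame columns that
// are not already defined in the schema, instead of trying to create them.
csvDF.write
  .format("solr")
  .option("zkhost", "localhost:9983")       // placeholder
  .option("collection", "mycollection")     // placeholder
  .option("schema_immutable", "true")       // proposed flag (hypothetical)
  .save()
```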