Description
I created a new data type extending Point. When trying to build an index, I got the following error:
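For context, a custom shape like this is typically a subclass of Point that parses an extra field or two from each text record. The sketch below is illustrative only: it uses a minimal stand-in `Point` base class, whereas the real class would extend `edu.umn.cs.spatialHadoop.core.Point` and also implement the Hadoop serialization methods (`write`/`readFields`); the `Antenna` record layout (`id,x,y`) is an assumption.

```java
import java.util.Locale;

// Minimal stand-in for edu.umn.cs.spatialHadoop.core.Point (illustrative only).
class Point {
    double x, y;
}

// Hypothetical custom shape: one antenna record per line, formatted "id,x,y".
class Antenna extends Point {
    long id;

    // Parse one text record, analogous to Shape.fromText(Text).
    void fromText(String line) {
        String[] parts = line.split(",");
        id = Long.parseLong(parts[0].trim());
        x = Double.parseDouble(parts[1].trim());
        y = Double.parseDouble(parts[2].trim());
    }

    // Serialize back to text, analogous to Shape.toText(Text).
    String toText() {
        return String.format(Locale.ROOT, "%d,%s,%s", id, x, y);
    }
}
```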
```
~/tempdir$ shadoop index -libjars spatial-types.jar antenas_santiago_por_torre antennas.grid mbr:-100,-100,100,100 sindex:rtree shape:Antenna
16/04/07 15:26:52 INFO core.SpatialSite: Adding JAR 'spatial-types.jar' to job class path
16/04/07 15:26:53 INFO mapreduce.SpatialInputFormat3: No block filter specified
16/04/07 15:26:53 INFO input.FileInputFormat: Total input paths to process : 1
16/04/07 15:26:53 INFO spatialHadoop.OperationsParams: Input size is small enough to use local machine
16/04/07 15:26:53 INFO indexing.Indexer: Reading a sample of 1%
16/04/07 15:26:53 INFO client.RMProxy: Connecting to ResourceManager at HP-Z420-Workstation/192.168.6.11:8032
16/04/07 15:26:53 INFO client.RMProxy: Connecting to ResourceManager at HP-Z420-Workstation/192.168.6.11:8032
16/04/07 15:26:53 INFO client.RMProxy: Connecting to ResourceManager at HP-Z420-Workstation/192.168.6.11:8032
16/04/07 15:26:53 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/04/07 15:26:53 INFO mapred.FileInputFormat: No block filter specified
16/04/07 15:26:53 INFO mapred.FileInputFormat: Total input paths to process : 1
16/04/07 15:26:53 INFO mapreduce.JobSubmitter: number of splits:1
16/04/07 15:26:54 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1459964025423_0061
16/04/07 15:26:54 INFO impl.YarnClientImpl: Submitted application application_1459964025423_0061
16/04/07 15:26:54 INFO mapreduce.Job: The url to track the job: http://HP-Z420-Workstation:8088/proxy/application_1459964025423_0061/
16/04/07 15:26:54 INFO mapreduce.Job: Running job: job_1459964025423_0061
16/04/07 15:26:59 INFO mapreduce.Job: Job job_1459964025423_0061 running in uber mode : false
16/04/07 15:26:59 INFO mapreduce.Job: map 0% reduce 0%
16/04/07 15:27:04 INFO mapreduce.Job: Task Id : attempt_1459964025423_0061_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1707)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
	... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
	... 14 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
	... 17 more
Caused by: java.lang.NullPointerException
	at edu.umn.cs.spatialHadoop.operations.Sampler$Map.configure(Sampler.java:100)
	... 22 more
```
The problem was fixed by renaming the custom class from 'Antenna' to 'PointAntenna'.
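A possible explanation, consistent with the `NullPointerException` in `Sampler$Map.configure` above, is that the `shape:` argument is resolved at configure time and an unrecognized name yields a null shape object that is dereferenced later. The sketch below is a self-contained guess at that kind of resolution logic (prefix matching for built-in names, reflection for everything else); the method name, the prefix-matching behavior, and the stand-in return values are all assumptions, not SpatialHadoop's actual code.

```java
// Illustrative sketch of name-to-shape resolution; not SpatialHadoop source.
class ShapeResolution {
    // Resolve a shape name: built-in prefixes first, then reflection.
    static Object resolveShape(String name) {
        String n = name.toLowerCase();
        if (n.startsWith("point")) {
            // Stand-in for a built-in Point instance.
            return new double[] {0.0, 0.0};
        }
        try {
            // Custom shapes would need a resolvable (fully qualified) class name.
            return Class.forName(name).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            // Unresolvable name: returning null here would surface later
            // as a NullPointerException when the shape is first used.
            return null;
        }
    }
}
```

Under this (assumed) scheme, `shape:Antenna` resolves to null, while `shape:PointAntenna` happens to match a built-in prefix and produces a usable object, which would explain why the rename made the error disappear.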