Spark Unable to load native-hadoop library for your platform

Despite the warning in the title, the actual failure is that no master URL is set in the Spark configuration. This typically happens when an application is launched directly from an IDE rather than through spark-submit. Setting spark.master to local in the SparkSession builder fixes the issue:

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession
   .builder()
   .appName("JavaSparkPi")
   .config("spark.master", "local")  // run Spark locally with a single worker thread
   .getOrCreate();
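
Equivalently, the master can be set with the builder's master() method. A minimal sketch, assuming a local run, where local[*] uses one worker thread per CPU core:

SparkSession spark = SparkSession
   .builder()
   .appName("JavaSparkPi")
   .master("local[*]")  // shorthand for config("spark.master", "local[*]")
   .getOrCreate();

In production the master is usually not hardcoded at all; it is supplied at launch time instead, for example spark-submit --master yarn --class org.apache.spark.examples.JavaSparkPi app.jar (app.jar being a hypothetical artifact name), so the same jar runs locally and on a cluster.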

The full error generated by Spark is the following:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/11/02 10:41:24 INFO SparkContext: Running Spark version 2.2.0
17/11/02 10:41:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/02 10:41:25 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
	at org.apache.spark.examples.JavaSparkPi.main(JavaSparkPi.java:43)
17/11/02 10:41:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
	at org.apache.spark.examples.JavaSparkPi.main(JavaSparkPi.java:43)
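
The NativeCodeLoader line is only a warning, not the cause of the failure: it means Hadoop's native libraries are unavailable on this platform, so Spark falls back to its builtin Java implementations, which is harmless for local development. If the warning is noisy, one common way to silence it is to raise the log level for that class in log4j.properties. A minimal sketch, assuming the default log4j 1.x configuration that Spark 2.2 ships with:

# conf/log4j.properties
# Suppress the "Unable to load native-hadoop library" warning
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR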

References:

Java 8
Apache Spark