[SPARK-38778][INFRA][BUILD] Replace http with https for project url in pom

### What changes were proposed in this pull request?

Change `<url>http://spark.apache.org/</url>` to `<url>https://spark.apache.org/</url>` in the project URL of all pom files.

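The change is purely mechanical; as a hypothetical sketch (not taken from the PR itself), the substitution amounts to:

```python
# Hypothetical sketch of the substitution applied in this commit: rewrite the
# plain-http project URL in a pom fragment to https.
sample = "<url>http://spark.apache.org/</url>"
fixed = sample.replace("http://spark.apache.org", "https://spark.apache.org")
print(fixed)  # <url>https://spark.apache.org/</url>
```

Note that `"http://spark.apache.org"` is not a substring of the https form, so the replacement is idempotent when run over already-migrated files.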
### Why are the changes needed?

Fix the home page shown on Maven Central, e.g. https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.13/3.2.1

#### From

License | Apache 2.0
-- | --
Categories | Hadoop Query Engines
HomePage | http://spark.apache.org/
Date | (Jan 26, 2022)

#### To

License | Apache 2.0
-- | --
Categories | Hadoop Query Engines
HomePage | https://spark.apache.org/
Date | (Jan 26, 2022)

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Passes the existing GitHub Actions (GHA) checks.
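Beyond CI, a repository-wide scan can confirm that no plain-http project URLs remain; the checker below and its throwaway demo directory are a hypothetical sketch, not part of the patch:

```python
import tempfile
from pathlib import Path

def remaining_http_poms(root):
    """Return paths of pom.xml files under root that still use the plain-http URL."""
    return [str(p) for p in Path(root).rglob("pom.xml")
            if "http://spark.apache.org" in p.read_text()]

# Demo on a throwaway directory with one migrated pom and one stale pom.
with tempfile.TemporaryDirectory() as root:
    good = Path(root, "core"); good.mkdir()
    bad = Path(root, "repl"); bad.mkdir()
    (good / "pom.xml").write_text("<url>https://spark.apache.org/</url>\n")
    (bad / "pom.xml").write_text("<url>http://spark.apache.org/</url>\n")
    print(remaining_http_poms(root))  # one entry: the stale pom
```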

Closes apache#36053 from yaooqinn/SPARK-38778.

Authored-by: Kent Yao <[email protected]>
Signed-off-by: Yuming Wang <[email protected]>
yaooqinn authored and wangyum committed Apr 4, 2022
1 parent 835b46d commit 65d347d
Showing 40 changed files with 47 additions and 47 deletions.
2 changes: 1 addition & 1 deletion R/pkg/R/DataFrame.R
@@ -608,7 +608,7 @@ setMethod("cache",
 #'
 #' Persist this SparkDataFrame with the specified storage level. For details of the
 #' supported storage levels, refer to
-#' \url{http://spark.apache.org/docs/latest/rdd-programming-guide.html#rdd-persistence}.
+#' \url{https://spark.apache.org/docs/latest/rdd-programming-guide.html#rdd-persistence}.
 #'
 #' @param x the SparkDataFrame to persist.
 #' @param newLevel storage level chosen for the persistence. See available options in
2 changes: 1 addition & 1 deletion R/pkg/R/RDD.R
@@ -227,7 +227,7 @@ setMethod("cacheRDD",
 #'
 #' Persist this RDD with the specified storage level. For details of the
 #' supported storage levels, refer to
-#'\url{http://spark.apache.org/docs/latest/rdd-programming-guide.html#rdd-persistence}.
+#'\url{https://spark.apache.org/docs/latest/rdd-programming-guide.html#rdd-persistence}.
 #'
 #' @param x The RDD to persist
 #' @param newLevel The new storage level to be assigned
6 changes: 3 additions & 3 deletions R/pkg/R/sparkR.R
@@ -344,10 +344,10 @@ sparkRHive.init <- function(jsc = NULL) {
 #' the warehouse, an accompanied metastore may also be automatically created in the current
 #' directory when a new SparkSession is initialized with \code{enableHiveSupport} set to
 #' \code{TRUE}, which is the default. For more details, refer to Hive configuration at
-#' \url{http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables}.
+#' \url{https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables}.
 #'
 #' For details on how to initialize and use SparkR, refer to SparkR programming guide at
-#' \url{http://spark.apache.org/docs/latest/sparkr.html#starting-up-sparksession}.
+#' \url{https://spark.apache.org/docs/latest/sparkr.html#starting-up-sparksession}.
 #'
 #' @param master the Spark master URL.
 #' @param appName application name to register with cluster manager.
@@ -598,7 +598,7 @@ sparkConfToSubmitOps[["spark.kerberos.principal"]] <- "--principal"
 #
 # A few Spark Application and Runtime environment properties cannot take effect after driver
 # JVM has started, as documented in:
-# http://spark.apache.org/docs/latest/configuration.html#application-properties
+# https://spark.apache.org/docs/latest/configuration.html#application-properties
 # When starting SparkR without using spark-submit, for example, from Rstudio, add them to
 # spark-submit commandline if not already set in SPARKR_SUBMIT_ARGS so that they can be effective.
 getClientModeSparkSubmitOpts <- function(submitOps, sparkEnvirMap) {
2 changes: 1 addition & 1 deletion assembly/pom.xml
@@ -27,7 +27,7 @@

 <artifactId>spark-assembly_2.12</artifactId>
 <name>Spark Project Assembly</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <packaging>pom</packaging>

 <properties>
2 changes: 1 addition & 1 deletion common/kvstore/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-kvstore_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Local DB</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>kvstore</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-network-common_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Networking</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>network-common</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion common/network-shuffle/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-network-shuffle_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Shuffle Streaming Service</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>network-shuffle</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion common/network-yarn/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-network-yarn_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project YARN Shuffle Service</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>network-yarn</sbt.project.name>
 <!-- Make sure all Hadoop dependencies are provided to avoid repackaging. -->
2 changes: 1 addition & 1 deletion common/sketch/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-sketch_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Sketch</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>sketch</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion common/tags/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-tags_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Tags</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>tags</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion common/unsafe/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-unsafe_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Unsafe</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>unsafe</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion connector/avro/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Avro</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
2 changes: 1 addition & 1 deletion connector/docker-integration-tests/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-docker-integration-tests_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Docker Integration Tests</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>docker-integration-tests</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion connector/kafka-0-10-assembly/pom.xml
@@ -28,7 +28,7 @@
 <artifactId>spark-streaming-kafka-0-10-assembly_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Integration for Kafka 0.10 Assembly</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <properties>
 <sbt.project.name>streaming-kafka-0-10-assembly</sbt.project.name>
2 changes: 1 addition & 1 deletion connector/kafka-0-10-token-provider/pom.xml
@@ -32,7 +32,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Kafka 0.10+ Token Provider for Streaming</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
2 changes: 1 addition & 1 deletion connector/kafka-0-10/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Integration for Kafka 0.10</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
2 changes: 1 addition & 1 deletion connector/kinesis-asl-assembly/pom.xml
@@ -28,7 +28,7 @@
 <artifactId>spark-streaming-kinesis-asl-assembly_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Kinesis Assembly</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <properties>
 <sbt.project.name>streaming-kinesis-asl-assembly</sbt.project.name>
@@ -73,9 +73,9 @@
 * Credential profiles file - default location (~/.aws/credentials) shared by all AWS SDKs
 * Instance profile credentials - delivered through the Amazon EC2 metadata service
 * For more information, see
-* http://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/credentials.html
+* https://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/credentials.html
 *
-* See http://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
+* See https://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
 * the Kinesis Spark Streaming integration.
 */
 public final class JavaKinesisWordCountASL { // needs to be public for access from run-example
@@ -91,7 +91,7 @@ public static void main(String[] args) throws Exception {
 " <endpoint-url> is the endpoint of the Kinesis service\n" +
 " (e.g. https://kinesis.us-east-1.amazonaws.com)\n" +
 "Generate data for the Kinesis stream using the example KinesisWordProducerASL.\n" +
-"See http://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more\n" +
+"See https://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more\n" +
 "details.\n"
 );
 System.exit(1);
@@ -50,9 +50,9 @@
 Credential profiles file - default location (~/.aws/credentials) shared by all AWS SDKs
 Instance profile credentials - delivered through the Amazon EC2 metadata service
 For more information, see
-http://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/credentials.html
+https://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/credentials.html

-See http://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
+See https://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
 the Kinesis Spark Streaming integration.
 """
 import sys
@@ -70,7 +70,7 @@ import org.apache.spark.streaming.kinesis.KinesisInputDStream
 * For more information, see
 * http://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/credentials.html
 *
-* See http://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
+* See https://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more details on
 * the Kinesis Spark Streaming integration.
 */
 object KinesisWordCountASL extends Logging {
@@ -87,7 +87,7 @@ object KinesisWordCountASL extends Logging {
 | (e.g. https://kinesis.us-east-1.amazonaws.com)
 |
 |Generate input data for Kinesis stream using the example KinesisWordProducerASL.
-|See http://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more
+|See https://spark.apache.org/docs/latest/streaming-kinesis-integration.html for more
 |details.
 """.stripMargin)
 System.exit(1)
2 changes: 1 addition & 1 deletion core/pom.xml
@@ -28,7 +28,7 @@
 <artifactId>spark-core_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Core</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <properties>
 <sbt.project.name>core</sbt.project.name>
2 changes: 1 addition & 1 deletion dev/checkstyle.xml
@@ -28,7 +28,7 @@
 with Spark-specific changes from:
-http://spark.apache.org/contributing.html#code-style-guide
+https://spark.apache.org/contributing.html#code-style-guide
 Checkstyle is very configurable. Be sure to read the documentation at
 http://checkstyle.sf.net (or in your downloaded distribution).
2 changes: 1 addition & 1 deletion examples/pom.xml
@@ -28,7 +28,7 @@
 <artifactId>spark-examples_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Examples</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <properties>
 <sbt.project.name>examples</sbt.project.name>
2 changes: 1 addition & 1 deletion graphx/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Project GraphX</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
2 changes: 1 addition & 1 deletion launcher/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-launcher_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Launcher</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>launcher</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion mllib-local/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Project ML Local Library</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
2 changes: 1 addition & 1 deletion mllib/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Project ML Library</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
4 changes: 2 additions & 2 deletions pom.xml
@@ -29,7 +29,7 @@
 <version>3.4.0-SNAPSHOT</version>
 <packaging>pom</packaging>
 <name>Spark Project Parent POM</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <licenses>
 <license>
 <name>Apache 2.0 License</name>
@@ -50,7 +50,7 @@
 <email>[email protected]</email>
 <url>https://cs.stanford.edu/people/matei</url>
 <organization>Apache Software Foundation</organization>
-<organizationUrl>http://spark.apache.org</organizationUrl>
+<organizationUrl>https://spark.apache.org</organizationUrl>
 </developer>
 </developers>
 <issueManagement>
2 changes: 1 addition & 1 deletion repl/pom.xml
@@ -28,7 +28,7 @@
 <artifactId>spark-repl_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project REPL</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <properties>
 <sbt.project.name>repl</sbt.project.name>
@@ -21,7 +21,7 @@ FROM openjdk:${java_image_tag}
 ARG spark_uid=185

 # Before building the docker image, first build and make a Spark distribution following
-# the instructions in http://spark.apache.org/docs/latest/building-spark.html.
+# the instructions in https://spark.apache.org/docs/latest/building-spark.html.
 # If this docker file is being used in the context of building your images from a Spark
 # distribution, the docker build command should be invoked from the top level directory
 # of the Spark distribution. E.g.:
@@ -21,7 +21,7 @@ FROM debian:bullseye-slim
 ARG spark_uid=185

 # Before building the docker image, first build and make a Spark distribution following
-# the instructions in http://spark.apache.org/docs/latest/building-spark.html.
+# the instructions in https://spark.apache.org/docs/latest/building-spark.html.
 # If this docker file is being used in the context of building your images from a Spark
 # distribution, the docker build command should be invoked from the top level directory
 # of the Spark distribution. E.g.:
2 changes: 1 addition & 1 deletion sql/catalyst/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-catalyst_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Catalyst</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>catalyst</sbt.project.name>
 </properties>
@@ -495,7 +495,7 @@ object QueryExecutionErrors {
 new ClassNotFoundException(
 s"""
 |Failed to find data source: $provider. Please find packages at
-|http://spark.apache.org/third-party-projects.html
+|https://spark.apache.org/third-party-projects.html
 """.stripMargin, error)
 }

2 changes: 1 addition & 1 deletion sql/core/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-sql_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project SQL</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>sql</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion sql/hive-thriftserver/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-hive-thriftserver_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Hive Thrift Server</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>hive-thriftserver</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion sql/hive/pom.xml
@@ -29,7 +29,7 @@
 <artifactId>spark-hive_2.12</artifactId>
 <packaging>jar</packaging>
 <name>Spark Project Hive</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>
 <properties>
 <sbt.project.name>hive</sbt.project.name>
 </properties>
2 changes: 1 addition & 1 deletion streaming/pom.xml
@@ -31,7 +31,7 @@
 </properties>
 <packaging>jar</packaging>
 <name>Spark Project Streaming</name>
-<url>http://spark.apache.org/</url>
+<url>https://spark.apache.org/</url>

 <dependencies>
 <dependency>
@@ -275,7 +275,7 @@ class StreamingContext private[streaming] (

 /**
 * Create an input stream with any arbitrary user implemented receiver.
-* Find more details at http://spark.apache.org/docs/latest/streaming-custom-receivers.html
+* Find more details at https://spark.apache.org/docs/latest/streaming-custom-receivers.html
 * @param receiver Custom implementation of Receiver
 */
 def receiverStream[T: ClassTag](receiver: Receiver[T]): ReceiverInputDStream[T] = {
@@ -421,7 +421,7 @@ class JavaStreamingContext(val ssc: StreamingContext) extends Closeable {

 /**
 * Create an input stream with any arbitrary user implemented receiver.
-* Find more details at: http://spark.apache.org/docs/latest/streaming-custom-receivers.html
+* Find more details at: https://spark.apache.org/docs/latest/streaming-custom-receivers.html
 * @param receiver Custom implementation of Receiver
 */
 def receiverStream[T](receiver: Receiver[T]): JavaReceiverInputDStream[T] = {