diff --git a/31/connect.html b/31/connect.html index 07f8778f0..f1b4da155 100644 --- a/31/connect.html +++ b/31/connect.html @@ -49,6 +49,7 @@

Running Kafka Connect
  • bootstrap.servers - List of Kafka servers used to bootstrap connections to Kafka
  • key.converter - Converter class used to convert between Kafka Connect format and the serialized form that is written to Kafka. This controls the format of the keys in messages written to or read from Kafka, and since this is independent of connectors it allows any connector to work with any serialization format. Examples of common formats include JSON and Avro.
  • value.converter - Converter class used to convert between Kafka Connect format and the serialized form that is written to Kafka. This controls the format of the values in messages written to or read from Kafka, and since this is independent of connectors it allows any connector to work with any serialization format. Examples of common formats include JSON and Avro.
  • plugin.path (default empty) - a list of paths that contain Connect plugins (connectors, converters, transformations). Before running quick starts, users must add the absolute path that contains the example FileStreamSourceConnector and FileStreamSinkConnector packaged in connect-file-"version".jar, because these connectors are not included by default to the CLASSPATH or the plugin.path of the Connect worker (see plugin.path property for examples).
The important configuration options specific to standalone mode are:
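Taken together, the common worker settings above can be sketched as a minimal standalone worker file. The broker address and plugin directory below are placeholders; offset.storage.file.filename is the standalone-specific option for persisting source connector offsets locally:

```properties
# connect-standalone.properties - illustrative values only
bootstrap.servers=localhost:9092

# Converters control the serialized form written to / read from Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# Absolute path(s) containing Connect plugins, e.g. connect-file-<version>.jar
plugin.path=/usr/local/share/kafka/plugins

# Standalone mode stores source connector offsets in a local file
offset.storage.file.filename=/tmp/connect.offsets
```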

    diff --git a/31/generated/admin_client_config.html b/31/generated/admin_client_config.html index c85bad7fe..9dd1206bf 100644 --- a/31/generated/admin_client_config.html +++ b/31/generated/admin_client_config.html @@ -274,7 +274,7 @@

diff --git a/31/generated/consumer_config.html b/31/generated/consumer_config.html index e58656cf7..696e771e6 100644 --- a/31/generated/consumer_config.html +++ b/31/generated/consumer_config.html @@ -251,7 +251,7 @@

  • isolation.level


    Controls how to read messages written transactionally. If set to read_committed, consumer.poll() will only return transactional messages which have been committed. If set to read_uncommitted (the default), consumer.poll() will return all messages, even transactional messages which have been aborted. Non-transactional messages will be returned unconditionally in either mode.

    Messages will always be returned in offset order. Hence, in read_committed mode, consumer.poll() will only return messages up to the last stable offset (LSO), which is the one less than the offset of the first open transaction. In particular any messages appearing after messages belonging to ongoing transactions will be withheld until the relevant transaction has been completed. As a result, read_committed consumers will not be able to read up to the high watermark when there are in flight transactions.

Further, when in read_committed mode, the seekToEnd method will return the LSO.
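For example, a consumer that should only see committed transactional writes sets this explicitly (read_uncommitted is the default):

```properties
# consumer.properties - only return messages up to the last stable offset (LSO)
isolation.level=read_committed
```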

    @@ -434,7 +434,7 @@

    retries


    Setting a value greater than zero will cause the client to resend any record whose send fails with a potentially transient error. Note that this retry is no different than if the client resent the record upon receiving the error. Produce requests will be failed before the number of retries has been exhausted if the timeout configured by delivery.timeout.ms expires first before successful acknowledgement. Users should generally prefer to leave this config unset and instead use delivery.timeout.ms to control retry behavior.

    Enabling idempotence requires this config value to be greater than 0. If conflicting configurations are set and idempotence is not explicitly enabled, idempotence is disabled.

    Allowing retries while setting enable.idempotence to false and max.in.flight.requests.per.connection to 1 will potentially change the ordering of records because if two batches are sent to a single partition, and the first fails and is retried but the second succeeds, then the records in the second batch may appear first.
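Following the recommendation above, a producer config would leave retries at its default and bound retry behavior by time instead (the timeout value here is illustrative; 120000 ms is the documented default):

```properties
# producer.properties - prefer bounding total send time over counting retries
# retries stays at its default (2147483647); cap total retry time instead
delivery.timeout.ms=120000
```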

Type:int
Default:2147483647
    @@ -374,7 +374,7 @@

  • acks


    The number of acknowledgments the producer requires the leader to have received before considering a request complete. This controls the durability of records that are sent. The following settings are allowed:

    • acks=0 If set to zero then the producer will not wait for any acknowledgment from the server at all. The record will be immediately added to the socket buffer and considered sent. No guarantee can be made that the server has received the record in this case, and the retries configuration will not take effect (as the client won't generally know of any failures). The offset given back for each record will always be set to -1.
    • acks=1 This will mean the leader will write the record to its local log but will respond without awaiting full acknowledgement from all followers. In this case should the leader fail immediately after acknowledging the record but before the followers have replicated it then the record will be lost.
    • acks=all This means the leader will wait for the full set of in-sync replicas to acknowledge the record. This guarantees that the record will not be lost as long as at least one in-sync replica remains alive. This is the strongest available guarantee. This is equivalent to the acks=-1 setting.

    Note that enabling idempotence requires this config value to be 'all'. If conflicting configurations are set and idempotence is not explicitly enabled, idempotence is disabled.
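The three acknowledgement levels described above correspond to one producer setting:

```properties
# producer.properties - durability vs. latency trade-off
acks=all     # wait for all in-sync replicas (equivalent to acks=-1)
# acks=1     # leader-only acknowledgement; record lost if leader fails before replication
# acks=0     # fire-and-forget; retries have no effect, returned offset is always -1
```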

  • Type:string
    Default:all
    @@ -431,7 +431,7 @@

  • enable.idempotence


    When set to 'true', the producer will ensure that exactly one copy of each message is written in the stream. If 'false', producer retries due to broker failures, etc., may write duplicates of the retried message in the stream. Note that enabling idempotence requires max.in.flight.requests.per.connection to be less than or equal to 5 (with message ordering preserved for any allowable value), retries to be greater than 0, and acks must be 'all'.

    Idempotence is enabled by default if no conflicting configurations are set. If conflicting configurations are set and idempotence is not explicitly enabled, idempotence is disabled. If idempotence is explicitly enabled and conflicting configurations are set, a ConfigException is thrown.
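The constraints above can be made explicit in a producer config; with idempotence explicitly enabled, any value violating a constraint below would cause a ConfigException:

```properties
# producer.properties - explicit idempotence with its required companion settings
enable.idempotence=true
acks=all                                   # required: must be 'all'
retries=2147483647                         # required: must be > 0
max.in.flight.requests.per.connection=5    # required: must be <= 5
```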

  • Type:boolean
    Default:true
    @@ -451,7 +451,7 @@

  • max.in.flight.requests.per.connection


    The maximum number of unacknowledged requests the client will send on a single connection before blocking. Note that if this config is set to be greater than 1 and enable.idempotence is set to false, there is a risk of message re-ordering after a failed send due to retries (i.e., if retries are enabled). Additionally, enabling idempotence requires this config value to be less than or equal to 5. If conflicting configurations are set and idempotence is not explicitly enabled, idempotence is disabled.
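Conversely, a producer that disables idempotence but must preserve per-partition ordering has to cap in-flight requests at one:

```properties
# producer.properties - with idempotence disabled, in-flight > 1 plus retries
# can reorder batches after a failed send; cap at 1 to preserve ordering
enable.idempotence=false
max.in.flight.requests.per.connection=1
```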

  • Type:int
    Default:5
    diff --git a/31/generated/sink_connector_config.html b/31/generated/sink_connector_config.html index 8c0f924c6..8c852aac9 100644 --- a/31/generated/sink_connector_config.html +++ b/31/generated/sink_connector_config.html @@ -151,7 +151,7 @@

  • errors.log.include.messages


Whether to include in the log the Connect record that resulted in a failure. For sink records, the topic, partition, offset, and timestamp will be logged. For source records, the key and value (and their schemas), all headers, and the timestamp, Kafka topic, Kafka partition, source partition, and source offset will be logged. This is 'false' by default, which will prevent record keys, values, and headers from being written to log files.
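A connector that should log failing records in full would combine this with the error-log switch (both are real connector config properties):

```properties
# connector config - log failing records, including keys/values/headers
errors.log.enable=true
errors.log.include.messages=true
```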

  • Type:boolean
    Default:false
    diff --git a/31/generated/source_connector_config.html b/31/generated/source_connector_config.html index 087b22d0f..9dccd63cf 100644 --- a/31/generated/source_connector_config.html +++ b/31/generated/source_connector_config.html @@ -131,7 +131,7 @@

    errors.log.include.messages


Whether to include in the log the Connect record that resulted in a failure. For sink records, the topic, partition, offset, and timestamp will be logged. For source records, the key and value (and their schemas), all headers, and the timestamp, Kafka topic, Kafka partition, source partition, and source offset will be logged. This is 'false' by default, which will prevent record keys, values, and headers from being written to log files.

    Type:boolean
    Default:false
    diff --git a/31/generated/streams_config.html b/31/generated/streams_config.html index 1e1330676..b23ea683d 100644 --- a/31/generated/streams_config.html +++ b/31/generated/streams_config.html @@ -34,7 +34,7 @@

All Classes and Interfaces (kafka 3.1.1 API)
@@ -90,6 +90,21 @@

Represents a filter which matches access control entries.
An AccessTokenRetriever is the internal API by which the login module will retrieve an access token for use in authorization by the broker.
An instance of AccessTokenValidator acts as a function object that, given an access token in base-64 encoded JWT format, can parse the data, perform validation, and construct an OAuthBearerToken for use by the caller.
Represents a binding between a resource pattern and an access control entry.
    @@ -238,59 +253,69 @@

A SampledStat that maintains a simple average over its samples.
An implementation of the OAuthBearerToken that fairly straightforwardly stores the values given to its constructor (except the scope set, which is copied to avoid modifications).
Interface for batching restoration of a StateStore. It is expected that implementations of this class will not call the StateRestoreCallback.restore(byte[], byte[]) method.
The Branched class is used to define the optional parameters when building branches with BranchedKStream.
Branches the records in the original stream based on the predicates supplied for the branch definitions.
Indicates that none of the specified brokers could be found.
This exception is thrown if the producer cannot allocate memory for a record within max.block.ms due to the buffer being too full.
A callback interface that the user can implement to allow code to execute when the request is complete.
Checkpoint records emitted from MirrorCheckpointConnector.
Simple utility class to perform basic cleaning and validation on input values so that they're performed consistently throughout the code base.
Describes a configuration alteration to be made to a client quota entity.
    @@ -329,143 +354,153 @@

Types of quotas that may be configured on brokers for client requests.
The OAuthBearerValidatorCallbackHandler uses a VerificationKeyResolver as part of its validation of the incoming JWT.
An immutable representation of a subset of the nodes, topics, and partitions in the Kafka cluster.
The ClusterResource class encapsulates metadata for a Kafka cluster.
A callback interface that users can implement when they wish to get notified about changes in the Cluster metadata.
CogroupedKStream is an abstraction of multiple grouped record streams of KeyValue pairs.
This exception is raised when an offset commit with KafkaConsumer.commitSync() fails with an unrecoverable error.
A compound stat is a stat where a single measurement and associated data structure feeds many metrics.
A configuration object containing the configuration entries for a resource.
A callback passed to ConfigProvider for subscribing to changes.
Configuration data from a ConfigProvider.
This class is used for specifying the set of expected configurations.
The importance level for a configuration.
Validation logic for numeric ranges.
This is used by the ConfigDef.validate(Map) to get valid values for a configuration given the current configuration values in order to perform full configuration validation and visibility modification.
The config types.
Validation logic the user may provide to perform single configuration validation.
The width of a configuration value.
A class representing a configuration entry containing name, value and additional metadata.
Source of configuration entries.
Class representing a configuration synonym of a ConfigEntry.
Data type of configuration entry.
Thrown if the user supplies an invalid configuration.
A provider of configuration data, which may optionally support subscriptions to configuration changes.
A class representing resources that have configs.
Type of resource.
This class wraps a set of ConfigProvider instances and uses them to perform transformations.
The result of a transformation from ConfigTransformer.
A Mix-in style interface for classes that are instantiated by reflection and need to take configuration parameters.
ConfigurationUtils is a utility class to perform basic configuration-related logic and is separated out here for easier, more direct testing.
    @@ -992,98 +1027,111 @@

An implementation of ConfigProvider that represents a Properties file.
FileTokenRetriever is an AccessTokenRetriever that will load the contents, interpreting them as a JWT access key in the serialized form.
Represents a range of version levels supported by every broker in a cluster for some feature.
The ForeachAction interface for performing an action on a key-value pair.
A CompoundStat that represents a normalized distribution with a Frequency metric for each bucketed value.
Definition of a frequency metric used in a Frequencies compound statistic.
A gauge metric is an instantaneous reading of a particular value.
GlobalKTable is an abstraction of a changelog stream from a primary-keyed table.
The class that is used to capture the key and value Serdes and set the part of name used for repartition topics when performing KStream.groupBy(KeyValueMapper, Grouped), KStream.groupByKey(Grouped), or KTable.groupBy(KeyValueMapper, Grouped) operations.
Indicates that a consumer group is already at its configured maximum capacity and cannot accommodate more members.
A Header is a key-value pair, and multiple headers can be included with the key, value, and timestamp in each Kafka message.
A mutable ordered collection of Header objects.
A function to transform the supplied Header.
Heartbeat message sent from MirrorHeartbeatTask to target cluster.
An algorithm for determining the bin in which a value is to be placed as well as calculating the upper end of each bin.
A scheme for calculating the bins where the width of each bin is a constant determined by the range of values and the number of bins.
A scheme for calculating the bins where the width of each bin is one more than the previous bin, and therefore the bin widths are increasing at a linear rate.
Represents a user defined endpoint in a KafkaStreams application.
HttpAccessTokenRetriever is an AccessTokenRetriever that will communicate with an OAuth/OIDC provider directly via HTTP to post client credentials (OAuthBearerLoginCallbackHandler.CLIENT_ID_CONFIG/OAuthBearerLoginCallbackHandler.CLIENT_SECRET_CONFIG) to a publicized token endpoint URL (SaslConfigs.SASL_OAUTHBEARER_TOKEN_ENDPOINT_URL).
IdentityReplicationPolicy does not rename remote topics.
    @@ -1106,112 +1154,119 @@

The Initializer interface for creating an initial value in aggregations.
Annotation to inform users of how much to rely on a particular package, class or method not changing over time.
Compatibility may be broken at minor release (i.e.
Compatibility is maintained in major, minor and patch releases with one exception: compatibility may be broken in a major release (i.e.
No guarantee is provided as to reliability or stability across any level of release granularity.
An unchecked wrapper for InterruptedException.
An exception that may indicate the client's metadata is out of date.
Thrown when the offset for a set of partitions is invalid (either undefined or out of range), and no reset policy has been configured.
Thrown when the offset for a set of partitions is invalid (either undefined or out of range), and no reset policy has been configured.
This exception indicates that the produce request sent to the partition leader contains a non-matching producer epoch.
Thrown when a request breaks basic wire protocol rules.
Indicates that there was a problem when trying to access a StateStore.
Indicates that the specific state store being queried via StoreQueryParameters used a partitioning that is not assigned to this instance.
Indicates that the timestamp of a record is invalid.
The client has attempted to perform an operation on an invalid topic.
The transaction coordinator returns this error code if the timeout received via the InitProducerIdRequest is larger than the `transaction.max.timeout.ms` config value.
JaasOptionsUtils is a utility class to perform logic for the JAAS options and is separated out here for easier, more direct testing.
Register metrics in JMX as dynamic mbeans based on the metric names.
    @@ -1226,219 +1281,231 @@

The window specifications used for joins.
JwksFileVerificationKeyResolver is a VerificationKeyResolver implementation that will load the JWKS from the given file system directory.
The default implementation of Admin.
KafkaClientSupplier can be used to provide custom Kafka clients to a KafkaStreams instance.
A client that consumes records from a Kafka cluster.
The base class of all other Kafka exceptions.
A flexible future which supports call chaining and other asynchronous programming patterns.
A function which takes objects of type A and returns objects of type B.
A consumer of two different types of object.
Deprecated. Since Kafka 3.0.
A implementation of MetricsContext, it encapsulates required metrics context properties for Kafka services and clients.
Principals in Kafka are defined by a type and a name.
Pluggable principal builder interface which supports both SSL authentication through SslAuthenticationContext and SASL through SaslAuthenticationContext.
Serializer/Deserializer interface for KafkaPrincipal for the purpose of inter-broker forwarding.
A Kafka client that publishes records to the Kafka cluster.
Miscellaneous disk-related IOException occurred when handling a request.
A Kafka client that allows for performing continuous computation on input coming from one or more input topics and sends output to zero, one, or more output topics.
Kafka Streams states are the possible state that a Kafka Streams instance can be in.
Listen to KafkaStreams.State change events.
Represents all the metadata related to a key, where a particular key resides in a KafkaStreams application.
A key-value pair defined for a single Kafka Streams record.
A store supplier that can be used to create one or more KeyValueStore<Bytes, byte[]> instances of type <Bytes, byte[]>.
Iterator interface of KeyValue.
The KeyValueMapper interface for mapping a key-value pair to a new value of arbitrary type.
A key-value store that supports put/get/delete and range queries.
KGroupedStream is an abstraction of a grouped record stream of KeyValue pairs.
KGroupedTable is an abstraction of a re-grouped changelog stream from a primary-keyed table, usually on a different grouping key than the original primary key.
KStream is an abstraction of a record stream of KeyValue pairs, i.e., each record is an independent entity/event in the real world.
KTable is an abstraction of a changelog stream from a primary-keyed table.
Encapsulates information about lag, at a store partition replica (active or standby).
There is no currently available leader for the given partition (either because a leadership election is in progress or because all replicas are down).
The result of the Admin.listConsumerGroups() call.
The leader does not have an endpoint corresponding to the listener on which metadata was requested.
The result of the Admin.listOffsets(Map) call.
Options for Admin.listPartitionReassignments(ListPartitionReassignmentsOptions). The API of this class is evolving.
Options for Admin.listTopics().
The result of the Admin.listTopics() call.
The result of the Admin.listTransactions() call.
Indicates that the state store directory lock could not be acquired because another thread holds the lock.
Deserialization handler that logs a deserialization exception and then signals the processing pipeline to continue processing more records.
Deserialization handler that logs a deserialization exception and then signals the processing pipeline to stop processing more records and fail.
Retrieves embedded metadata timestamps from Kafka messages.
A description of a log directory on a particular broker.
Thrown when a request is made for a log directory that is not present on the broker.
Login interface for authentication.
LoginAccessTokenValidator is an implementation of AccessTokenValidator that is used by the client to perform some rudimentary validation of the JWT access token that is received as part of the response from posting the client credentials to the OAuth/OIDC provider's token endpoint.
This class holds definitions for log level configurations related to Kafka's application logging.
    @@ -1646,28 +1713,43 @@

Callback handlers should use the OAuthBearerExtensionsValidatorCallback.valid(String) method to communicate valid extensions back to the SASL server.
OAuthBearerLoginCallbackHandler is an AuthenticateCallbackHandler that accepts OAuthBearerTokenCallback and SaslExtensionsCallback callbacks to perform the steps to request a JWT from an OAuth/OIDC provider using the client credentials.
The LoginModule for the SASL/OAUTHBEARER mechanism.
The b64token value as defined in RFC 6750 Section 2.1 along with the token's specific scope and lifetime and principal name.
A Callback for use by the SaslClient and Login implementations when they require an OAuth 2 bearer token.
A Callback for use by the SaslServer implementation when it needs to provide an OAuth 2 bearer token compact serialization for validation.
OAuthBearerValidatorCallbackHandler is an AuthenticateCallbackHandler that accepts OAuthBearerValidatorCallback and OAuthBearerExtensionsValidatorCallback callbacks to implement OAuth/OIDC validation.
The Kafka offset commit API allows users to provide additional metadata (in the form of a string)
@@ -1948,6 +2030,17 @@
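As a sketch of how these OAuth classes fit together, a client might enable SASL/OAUTHBEARER with the login callback handler roughly as below. The endpoint URL and credentials are placeholders, and the handler's package path is an assumption for this release; the property names themselves (sasl.oauthbearer.token.endpoint.url, clientId, clientSecret) come from the Kafka OAuth/OIDC support:

```properties
# client.properties - OAuth/OIDC login via OAuthBearerLoginCallbackHandler (illustrative)
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
sasl.oauthbearer.token.endpoint.url=https://example.auth.server/oauth2/token
sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
  clientId="my-client" clientSecret="my-secret";
```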

The Reducer interface for combining two values of the same type into a new value.
Implementation of HttpsJwks that will periodically refresh the JWKS cache to reduce or even prevent HTTP/HTTPS traffic in the hot path of validation.
RefreshingHttpsJwksVerificationKeyResolver is a VerificationKeyResolver implementation that will periodically refresh the JWKS using its HttpsJwks instance.
Convenience methods for multi-cluster environments.
    @@ -2065,6 +2158,15 @@

    All Classes and Interfaces<
    An exception that indicates the operation can be reattempted.
    + +
    +
    Retry encapsulates the mechanism to perform a retry and then exponential + backoff using provided wait times between attempts.
    +
    + +
    +
    Simple interface to abstract out the call that is made so that it can be retried.
    +
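The Retry/Retryable pair described above can be illustrated with a small, stdlib-only Java sketch. RetrySketch and its parameter names are hypothetical stand-ins, not the actual org.apache.kafka.common.security.oauthbearer.secured.Retry API:

```java
import java.util.function.Supplier;

// Hypothetical sketch of retry-then-exponential-backoff; NOT the actual
// org.apache.kafka.common.security.oauthbearer.secured.Retry class.
public class RetrySketch {
    public static <R> R retry(Supplier<R> call, long initialWaitMs,
                              long maxWaitMs, int maxAttempts) throws InterruptedException {
        long waitMs = initialWaitMs;
        RuntimeException lastFailure = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();                         // the Retryable-style call
            } catch (RuntimeException e) {
                lastFailure = e;
                if (attempt == maxAttempts)
                    break;                                 // out of attempts
                Thread.sleep(waitMs);                      // wait before the next attempt
                waitMs = Math.min(waitMs * 2, maxWaitMs);  // exponential growth, capped
            }
        }
        throw lastFailure;
    }
}
```

A transient failure that succeeds on a later attempt returns normally; a call that fails on every attempt rethrows the last failure.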
    An interface that allows developers to customize the RocksDB settings for a given Store.
    @@ -2211,578 +2313,585 @@

    All Classes and Interfaces<
    Any exception during serialization in the producer
    - -
    -
    An interface for converting objects to bytes.
    + +
    +
    SerializedJwt provides a modicum of structure and validation around a JWT's serialized form by + splitting and making the three sections (header, payload, and signature) available to the user.
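The three-section split that SerializedJwt is described as performing can be sketched with plain string handling; the JwtSections class below is illustrative only, not the Kafka class itself:

```java
// Illustrative only: splits a compact-serialized JWT (header.payload.signature)
// into its three sections, mirroring what SerializedJwt is described as doing.
public class JwtSections {
    public final String header;
    public final String payload;
    public final String signature;

    public JwtSections(String serialized) {
        if (serialized == null || serialized.trim().isEmpty())
            throw new IllegalArgumentException("JWT must be non-empty");
        // A compact JWS serialization is exactly three dot-separated sections.
        String[] sections = serialized.trim().split("\\.");
        if (sections.length != 3)
            throw new IllegalArgumentException(
                "expected 3 sections, found " + sections.length);
        this.header = sections[0];
        this.payload = sections[1];
        this.signature = sections[2];
    }
}
```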
    - +
    -
    A store supplier that can be used to create one or more SessionStore<Byte, byte[]> instances.
    +
    An interface for converting objects to bytes.
    - +
    -
    Interface for storing the aggregated values of sessions.
    +
    A store supplier that can be used to create one or more SessionStore<Byte, byte[]> instances.
    - +
    +
    Interface for storing the aggregated values of sessions.
    +
    + +
    SessionWindowedCogroupKStream is an abstraction of a windowed record stream of KeyValue pairs.
    - -
     
    - -
    + +
     
    + +
    SessionWindowedKStream is an abstraction of a windowed record stream of KeyValue pairs.
    - -
     
    - -
    + +
     
    + +
    A session based window specification used for aggregating events into sessions.
    - -
     
    - +
     
    - -
    + +
     
    + +
    A HeaderConverter that serializes header values as strings and that deserializes header values to the most appropriate numeric, boolean, array, or map representation.
    - -
    + +
    A simple rate; the rate is incrementally calculated based on the elapsed time between the earliest reading and now.
    - -
    + +
    SinkConnectors implement the Connector interface to send Kafka data to another system.
    - -
    + +
    A context to allow a SinkConnector to interact with the Kafka Connect runtime.
    - -
    + +
    SinkRecord is a ConnectRecord that has been read from Kafka and includes the kafkaOffset of the record in the Kafka topic-partition in addition to the standard fields.
    - -
    + +
    SinkTask is a Task that takes records loaded from Kafka and sends them to another system.
    - -
    + +
    Context passed to SinkTasks, allowing them to access utilities in the Kafka Connect runtime.
    - -
    + +
    A sliding window used for aggregating events.
    - -
     
    - -
    + +
     
    + +
    Directional pair of clusters, where source is replicated to target.
    - -
    + +
    SourceConnectors implement the connector interface to pull data from another system and send it to Kafka.
    - -
    + +
    A context to allow a SourceConnector to interact with the Kafka Connect runtime.
    - -
    + +
    SourceRecords are generated by SourceTasks and passed to Kafka Connect for storage in Kafka.
    - -
    + +
    SourceTask is a Task that pulls records from another system for storage in Kafka.
    - -
    + +
    SourceTaskContext is provided to SourceTasks to allow them to interact with the underlying runtime.
    - -
     
    - -
    + +
     
    + +
    This exception indicates that SSL handshake has failed.
    - -
    + +
    Describes whether the server should require or request client authentication.
    - -
     
    - -
    + +
     
    + +
    Plugin interface for allowing creation of SSLEngine object in a custom way.
    - -
     
    - -
    + +
     
    + +
    A Stat is a quantity such as average, max, etc that is computed off the stream of updates to a sensor
    - -
    + +
    Restoration logic for log-backed state stores upon restart; it takes one record at a time from the logs to apply to the restoring state.
    - -
    + +
    Class for listening to various states of the restoration process of a StateStore.
    - -
    + +
    Factory for creating serializers / deserializers for state stores in Kafka Streams.
    - -
    + +
    A storage engine for managing state maintained by a stream processor.
    - -
    + +
    State store context interface.
    - -
    + +
    Indicates that the state store being queried is closed although the Kafka Streams state is RUNNING or REBALANCING.
    - -
    + +
    Indicates that the state store being queried is already closed.
    - -
    + +
    The sticky assignor serves two purposes.
    - -
    + +
    Build a StateStore wrapped with optional caching and logging.
    - -
    + +
    StoreQueryParameters allows you to pass a variety of parameters when fetching a store for interactive query.
    - -
    + +
    Factory for creating state stores in Kafka Streams.
    - -
    + +
    A state store supplier which can create one or more StateStore instances.
    - -
    + +
    Class used to configure the name of the join processor, the repartition topic name, state stores or state store names in Stream-Stream join.
    - -
    + +
    Determine how records are distributed among the partitions in a Kafka topic.
    - -
    + +
    StreamsBuilder provides the high-level Kafka Streams DSL to specify a Kafka Streams topology.
    - -
    + +
    Configuration for a KafkaStreams instance.
    - -
     
    - -
    + +
     
    + +
    StreamsException is the top-level exception type generated by Kafka Streams, and indicates errors have occurred during a StreamThread's processing.
    - -
    Deprecated. + +
    Deprecated.
    since 3.0.0 use StreamsMetadata
    - -
    + +
    Metadata of a Kafka Streams client.
    - -
    + +
    The Kafka Streams metrics interface for adding metric sensors and collecting metric values.
    - -
    + +
    Indicates that Kafka Streams is in state CREATED and thus state stores cannot be queried yet.
    - -
    + +
    Indicates that Kafka Streams is in state REBALANCING and thus cannot be queried by default.
    - -
     
    - -
    + +
     
    + +
    Enumeration that describes the response from the exception handler.
    - -
    + +
    Converter and HeaderConverter implementation that only supports serializing to strings.
    - -
    + +
    Configuration options for StringConverter instances.
    - -
    + +
    String encoding defaults to UTF8 and can be customized by setting the property key.deserializer.encoding, value.deserializer.encoding or deserializer.encoding.
    - -
    + +
    String encoding defaults to UTF8 and can be customized by setting the property key.serializer.encoding, value.serializer.encoding or serializer.encoding.
    - -
    + +
    A structured record containing a set of named fields with values, each field using an independent Schema.
    - -
    + +
    Represents a range of versions that a particular broker supports for some feature.
    - -
     
    - +
     
    - -
    + +
     
    + +
    Marker interface for a buffer configuration that will strictly enforce size constraints (bytes and/or number of records) on the buffer, so it is suitable for reducing duplicate results downstream, but does not promise to eliminate them entirely.
    - -
    + +
    Marker interface for a buffer configuration that is "strict" in the sense that it will strictly enforce the time bound and never emit early.
    - -
    + +
    The TableJoined class represents optional parameters that can be passed to KTable#join(KTable,Function,...) and KTable#leftJoin(KTable,Function,...) operations, for foreign key joins.
    - -
    + +
    Tasks contain the code that actually copies data to/from another system.
    - -
    + +
    Indicates a run time error incurred while trying to assign stream tasks to threads.
    - -
    + +
    Indicates a specific task is corrupted and needs to be re-initialized.
    - -
    + +
    The task ID representation composed as subtopology (aka topicGroupId) plus the assigned partition ID.
    - -
    + +
    Indicates a runtime error incurred while trying to parse the task id from the read string.
    - -
    Deprecated. + +
    Deprecated.
    since 3.0, use TaskMetadata instead.
    - -
    + +
    Metadata of a task.
    - -
    + +
    Indicates that all tasks belonging to the thread have migrated to another thread.
    - -
    + +
    Describes the state, IDs, and any errors of a connector task.
    - -
    + +
    TestInputTopic is used to pipe records to a topic in TopologyTestDriver.
    - -
    + +
    TestOutputTopic is used to read records from a topic in TopologyTestDriver.
    - -
    + +
    A key/value pair, including timestamp and record headers, to be sent to or received from TopologyTestDriver.
    - -
    Deprecated. + +
    Deprecated.
    since 3.0 use ThreadMetadata instead
    - -
    + +
    Metadata of a stream thread.
    - -
    + +
    Exception thrown if an operation on a resource exceeds the throttling quota.
    - -
    + +
    A time representing a specific point in a day, not tied to any specific date.
    - -
    + +
    Indicates that a request timed out.
    - -
    + +
    A timestamp representing an absolute time, without timezone information.
    - -
     
    - -
    + +
     
    + +
    A key-(value/timestamp) store that supports put/get/delete and range queries.
    - -
    + +
    Interface for storing the aggregated values of fixed-size time windows.
    - -
    + +
    An interface that allows the Kafka Streams framework to extract a timestamp from an instance of ConsumerRecord.
    - -
    + +
    TimeWindowedCogroupKStream is an abstraction of a windowed record stream of KeyValue pairs.
    - -
     
    - -
    + +
     
    + +
    TimeWindowedKStream is an abstraction of a windowed record stream of KeyValue pairs.
    - -
     
    - -
    + +
     
    + +
    The fixed-size time-based window specifications used for aggregations.
    - -
    + +
    This class is used to provide the optional parameters when sending output records to a downstream processor using ProcessorContext.forward(Object, Object, To).
    - -
    + +
    The TokenBucket is a MeasurableStat implementing a token bucket algorithm that is usable within a Sensor.
    - -
    + +
    A class representing a delegation token details.
    - -
     
    - -
    + +
     
    + +
    A class used to represent a collection of topics.
    - -
    + +
    A class used to represent a collection of topics defined by their topic ID.
    - -
    + +
    A class used to represent a collection of topics defined by their topic name.
    - -
    + +
    Keys that can be used to configure a topic.
    - -
     
    - -
    + +
     
    + +
    A detailed description of a single topic in the cluster.
    - -
     
    - -
    + +
     
    + +
    This represents universally unique identifier with topic id for a topic partition.
    - -
    + +
    A listing of a topic in the cluster.
    - -
    + +
    An interface that allows one to dynamically determine the name of the Kafka topic to which records are sent at the sink node of the topology.
    - -
    + +
    A topic name and partition number
    - -
    + +
    A class containing leadership, replicas and ISR information for a topic partition.
    - -
    + +
    The topic name, partition number and the brokerId of the replica
    - -
    + +
    A logical representation of a ProcessorTopology.
    - -
    + +
    Sets the auto.offset.reset configuration when adding a source processor or when creating KStream or KTable via StreamsBuilder.
    - -
    + +
    A meta representation of a topology.
    - -
    + +
    Represents a global store.
    - -
    + +
    A node of a topology.
    - -
    + +
    A processor node of a topology.
    - -
    + +
    A sink node of a topology.
    - -
    + +
    A source node of a topology.
    - -
    + +
    A connected sub-graph of a Topology.
    - -
    + +
    Indicates a pre-runtime error occurred while parsing the logical topology to construct the physical processor topology.
    - -
    + +
    This class makes it easier to write tests to verify the behavior of topologies created with Topology or StreamsBuilder.
    - -
    + +
    This is the Exception thrown when we are aborting any undrained batches during a transaction which is aborted without any underlying cause - which likely means that the user chose to abort.
    - -
     
    - +
     
    - +
     
    - -
     
    - + +
     
    +
     
    - -
     
    - -
    + +
     
    + +
     
    + +
    Single message transformation for Kafka Connect record types.
    - -
    + +
    The Transformer interface is for stateful mapping of an input record to zero, one, or multiple new output records (both key and value type can be altered arbitrarily).
    - -
    + +
    A TransformerSupplier interface which can create one or more Transformer instances.
    - -
    + +
    Exception thrown when attempting to define a credential that does not meet the criteria for acceptability (for example, attempting to create a SCRAM credential with an empty username or password or too few/many iterations).
    - -
    + +
    The partitioning strategy: if a partition is specified in the record, use it; otherwise, choose the sticky partition that changes when the batch is full.
    - -
    + +
    The request contained a leader epoch which is larger than that on the broker that received the request.
    - -
     
    - -
    + +
     
    + +
    This exception is raised by the broker if it could not locate the producer metadata associated with the producerId in question.
    - -
    + +
    An error occurred on the server for which the client doesn't have a corresponding error code.
    - -
    + +
    Indicates that the state store being queried is unknown, i.e., the state store does either not exist in your topology or it is not queryable.
    - -
     
    - -
    + +
     
    + +
    This topic/partition doesn't exist.
    - -
    + +
    The unlimited window specifications used for aggregations.
    - -
    + + - -
    + + + +
     
    Exception thrown when there are unstable offsets for the requested topic partitions.
    @@ -2849,54 +2958,70 @@

    All Classes and Interfaces<
    We are converting UUID to String before serializing.
    - -
    -
    An instantaneous value.
    + +
    +
    ValidateException is thrown in cases where a JWT access token cannot be determined to be + valid for one reason or another.
    - + +
    +
    ValidatorAccessTokenValidator is an implementation of AccessTokenValidator that is used + by the broker to perform more extensive validation of the JWT access token that is received + from the client, but ultimately from posting the client credentials to the OAuth/OIDC provider's + token endpoint.
    +
    + +
     
    +
    +
    An instantaneous value.
    +
    + +
    Combines a value from a KeyValue with a timestamp.
    - -
    + +
    The ValueJoiner interface for joining two values into a new value of arbitrary type.
    - -
    + +
    The ValueJoinerWithKey interface for joining two values into a new value of arbitrary type.
    - -
    + +
    The ValueMapper interface for mapping a value to a new value of arbitrary type.
    - -
    + +
    The ValueMapperWithKey interface for mapping a value to a new value of arbitrary type.
    - -
    + +
    Utility for converting from one Connect value to a different form.
    - -
     
    - +
     
    - -
    + +
     
    + +
    The ValueTransformer interface for stateful mapping of a value to a new value (with possible new type).
    - -
    + +
    A ValueTransformerSupplier interface which can create one or more ValueTransformer instances.
    - -
    + +
    The ValueTransformerWithKey interface for stateful mapping of a value to a new value (with possible new type).
    - -
    + +
    A ValueTransformerWithKeySupplier interface which can create one or more ValueTransformerWithKey instances.
    + +
     
    Connect requires some components implement this interface to define a version string.
    diff --git a/31/javadoc/allclasses.html b/31/javadoc/allclasses.html new file mode 100644 index 000000000..02396dcdb --- /dev/null +++ b/31/javadoc/allclasses.html @@ -0,0 +1,825 @@ + + + + + +All Classes (kafka 3.1.1 API) + + + + + + + + + + + +

    All Classes

    +
    + +
    + + diff --git a/31/javadoc/allpackages-index.html b/31/javadoc/allpackages-index.html index 71fcd6229..b4ef08cf4 100644 --- a/31/javadoc/allpackages-index.html +++ b/31/javadoc/allpackages-index.html @@ -2,7 +2,7 @@ -All Packages (kafka 3.1.0 API) +All Packages (kafka 3.1.1 API) @@ -86,66 +86,68 @@

    All Packages

     
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    - +
     
    + +
     
    diff --git a/31/javadoc/constant-values.html b/31/javadoc/constant-values.html index 584166023..2b1c8d4ca 100644 --- a/31/javadoc/constant-values.html +++ b/31/javadoc/constant-values.html @@ -2,7 +2,7 @@ -Constant Field Values (kafka 3.1.0 API) +Constant Field Values (kafka 3.1.1 API) @@ -221,7 +221,7 @@

    org.apache.*

    "isolation.level"
    public static final String
    -
    "Controls how to read messages written transactionally. If set to <code>read_committed</code>, consumer.poll() will only return transactional messages which have been committed. If set to <code>read_uncommitted</code> (the default), consumer.poll() will return all messages, even transactional messages which have been aborted. Non-transactional messages will be returned unconditionally in either mode. <p>Messages will always be returned in offset order. Hence, in <code>read_committed</code> mode, consumer.poll() will only return messages up to the last stable offset (LSO), which is the one less than the offset of the first open transaction. In particular any messages appearing after messages belonging to ongoing transactions will be withheld until the relevant transaction has been completed. As a result, <code>read_committed</code> consumers will not be able to read up to the high watermark when there are in flight transactions.</p><p> Further, when in <code>read_committed</code> the seekToEnd method will return the LSO"
    +
    "Controls how to read messages written transactionally. If set to <code>read_committed</code>, consumer.poll() will only return transactional messages which have been committed. If set to <code>read_uncommitted</code> (the default), consumer.poll() will return all messages, even transactional messages which have been aborted. Non-transactional messages will be returned unconditionally in either mode. <p>Messages will always be returned in offset order. Hence, in <code>read_committed</code> mode, consumer.poll() will only return messages up to the last stable offset (LSO), which is the one less than the offset of the first open transaction. In particular any messages appearing after messages belonging to ongoing transactions will be withheld until the relevant transaction has been completed. As a result, <code>read_committed</code> consumers will not be able to read up to the high watermark when there are in flight transactions.</p><p> Further, when in <code>read_committed</code> the seekToEnd method will return the LSO</p>"
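A minimal consumer configuration exercising the setting documented above; the bootstrap address and group id are placeholders:

```properties
bootstrap.servers=localhost:9092
group.id=example-group
isolation.level=read_committed
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```

With read_committed, poll() stops at the last stable offset (LSO), so records behind an open transaction are withheld until that transaction completes.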
    public static final String
    "key.deserializer"
    @@ -408,7 +408,7 @@

    org.apache.*

    "enable.idempotence"
    public static final String
    -
    "When set to \'true\', the producer will ensure that exactly one copy of each message is written in the stream. If \'false\', producer retries due to broker failures, etc., may write duplicates of the retried message in the stream. Note that enabling idempotence requires <code>max.in.flight.requests.per.connection</code> to be less than or equal to 5 (with message ordering preserved for any allowable value), <code>retries</code> to be greater than 0, and <code>acks</code> must be \'all\'. If these values are not explicitly set by the user, suitable values will be chosen. If incompatible values are set, a <code>ConfigException</code> will be thrown."
    +
    "When set to \'true\', the producer will ensure that exactly one copy of each message is written in the stream. If \'false\', producer retries due to broker failures, etc., may write duplicates of the retried message in the stream. Note that enabling idempotence requires <code>max.in.flight.requests.per.connection</code> to be less than or equal to 5 (with message ordering preserved for any allowable value), <code>retries</code> to be greater than 0, and <code>acks</code> must be \'all\'. <p>Idempotence is enabled by default if no conflicting configurations are set. If conflicting configurations are set and idempotence is not explicitly enabled, idempotence is disabled. If idempotence is explicitly enabled and conflicting configurations are set, a <code>ConfigException</code> is thrown."
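A producer configuration consistent with the constraints in the doc string above (placeholder broker address); each setting satisfies one of the stated requirements:

```properties
bootstrap.servers=localhost:9092
enable.idempotence=true
# Required by idempotence: acks=all, retries > 0,
# and max.in.flight.requests.per.connection <= 5
acks=all
retries=2147483647
max.in.flight.requests.per.connection=5
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
```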
    public static final String
    "interceptor.classes"
    @@ -1219,6 +1219,59 @@

    org.apache.*

    • +
      org.apache.kafka.common.security.oauthbearer.secured.HttpAccessTokenRetriever
      +
      +
      Modifier and Type
      +
      Constant Field
      +
      Value
      +
      public static final String
      + +
      "Authorization"
      +
      +
    • +
    • +
      org.apache.kafka.common.security.oauthbearer.secured.LoginAccessTokenValidator
      +
      +
      Modifier and Type
      +
      Constant Field
      +
      Value
      +
      public static final String
      + +
      "exp"
      +
      public static final String
      + +
      "iat"
      +
      +
    • +
    • +
      org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
      +
      +
      Modifier and Type
      +
      Constant Field
      +
      Value
      +
      public static final String
      + +
      "clientId"
      +
      public static final String
      + +
      "The OAuth/OIDC identity provider-issued client ID to uniquely identify the service account to use for authentication for this client. The value must be paired with a corresponding clientSecret value and is provided to the OAuth provider using the OAuth clientcredentials grant type."
      +
      public static final String
      + +
      "clientSecret"
      +
      public static final String
      + +
      "The OAuth/OIDC identity provider-issued client secret serves a similar function as a password to the clientId account and identifies the service account to use for authentication for this client. The value must be paired with a corresponding clientId value and is provided to the OAuth provider using the OAuth clientcredentials grant type."
      +
      public static final String
      + +
      "scope"
      +
      public static final String
      + +
      "The (optional) HTTP/HTTPS login request to the token endpoint (sasl.oauthbearer.token.endpoint.url) may need to specify an OAuth \"scope\". If so, the scope is used to provide the value to include with the login request."
      +
      +
    • +
    +
    +
    retrieve() - Method in interface org.apache.kafka.common.security.oauthbearer.secured.AccessTokenRetriever
    +
    +
    Retrieves a JWT access token in its serialized three-part form.
    +
    +
    retrieve() - Method in class org.apache.kafka.common.security.oauthbearer.secured.FileTokenRetriever
    +
     
    +
    retrieve() - Method in class org.apache.kafka.common.security.oauthbearer.secured.HttpAccessTokenRetriever
    +
    +
    Retrieves a JWT access token in its serialized three-part form.
    +
    +
    Retry<R> - Class in org.apache.kafka.common.security.oauthbearer.secured
    +
    +
    Retry encapsulates the mechanism to perform a retry and then exponential + backoff using provided wait times between attempts.
    +
    +
    Retry(long, long) - Constructor for class org.apache.kafka.common.security.oauthbearer.secured.Retry
    +
     
    +
    Retry(Time, long, long) - Constructor for class org.apache.kafka.common.security.oauthbearer.secured.Retry
    +
     
    RETRY_BACKOFF_MS_CONFIG - Static variable in class org.apache.kafka.clients.admin.AdminClientConfig
    retry.backoff.ms
    @@ -12980,6 +13270,10 @@

    R

    retry.backoff.ms
    +
    Retryable<R> - Interface in org.apache.kafka.common.security.oauthbearer.secured
    +
    +
    Simple interface to abstract out the call that is made so that it can be retried.
    +
    retryOnQuotaViolation(boolean) - Method in class org.apache.kafka.clients.admin.CreatePartitionsOptions
    Set to true if quota violation should be automatically retried.
    @@ -13325,6 +13619,16 @@

    S

    RFC 6749 Section 1.4
    +
    scope() - Method in class org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken
    +
    +
    The token's scope of access, as per + RFC 6749 Section + 1.4
    +
    +
    SCOPE_CONFIG - Static variable in class org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
    +
     
    +
    SCOPE_DOC - Static variable in class org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
    +
     
    SCRAM_SHA_256 - Enum constant in enum class org.apache.kafka.clients.admin.ScramMechanism
     
    SCRAM_SHA_512 - Enum constant in enum class org.apache.kafka.clients.admin.ScramMechanism
    @@ -13711,6 +14015,13 @@

    S

     
    serializeBaseKey(String, Windowed<T>) - Method in class org.apache.kafka.streams.kstream.TimeWindowedSerializer
     
    +
    SerializedJwt - Class in org.apache.kafka.common.security.oauthbearer.secured
    +
    +
    SerializedJwt provides a modicum of structure and validation around a JWT's serialized form by + splitting and making the three sections (header, payload, and signature) available to the user.
    +
    +
    SerializedJwt(String) - Constructor for class org.apache.kafka.common.security.oauthbearer.secured.SerializedJwt
    +
     
    serializedKeySize() - Method in class org.apache.kafka.clients.consumer.ConsumerRecord
    The size of the serialized, uncompressed key in bytes.
    @@ -13911,6 +14222,8 @@

    S

    Returns true if SSLEngine needs to be rebuilt.
    +
    shouldCreateSSLSocketFactory(URL) - Method in class org.apache.kafka.common.security.oauthbearer.secured.JaasOptionsUtils
    +
     
    shouldListInternal() - Method in class org.apache.kafka.clients.admin.ListTopicsOptions
    Return true if we should list internal topics.
    @@ -14343,6 +14656,11 @@

    S

    When the credential became valid, in terms of the number of milliseconds since the epoch, if known, otherwise null.
    +
    startTimeMs() - Method in class org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken
    +
    +
    When the credential became valid, in terms of the number of milliseconds + since the epoch, if known, otherwise null.
    +
    stat - Variable in class org.apache.kafka.common.metrics.stats.Rate
     
    stat() - Method in class org.apache.kafka.common.metrics.CompoundStat.NamedMeasurable
    @@ -16152,6 +16470,8 @@

    T

     
    toString() - Method in class org.apache.kafka.common.security.auth.SaslExtensions
     
    +
    toString() - Method in class org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken
    +
     
    toString() - Method in class org.apache.kafka.common.security.token.delegation.DelegationToken
     
    toString() - Method in class org.apache.kafka.common.security.token.delegation.TokenInformation
    @@ -16681,6 +17001,14 @@

    U

    +
    UnretryableException - Exception in org.apache.kafka.common.security.oauthbearer.secured
    +
     
    +
    UnretryableException(String) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.UnretryableException
    +
     
    +
    UnretryableException(String, Throwable) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.UnretryableException
    +
     
    +
    UnretryableException(Throwable) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.UnretryableException
    +
     
    UnstableOffsetCommitException - Exception in org.apache.kafka.common.errors
    Exception thrown when there are unstable offsets for the requested topic partitions.
    @@ -16980,6 +17308,21 @@

    V

    Validates that this struct has filled in all the necessary data with valid values.
    +
    validate(String) - Method in interface org.apache.kafka.common.security.oauthbearer.secured.AccessTokenValidator
    +
    +
    Accepts an OAuth JWT access token in base-64 encoded format, validates, and returns an + OAuthBearerToken.
    +
    +
    validate(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.LoginAccessTokenValidator
    +
    +
    Accepts an OAuth JWT access token in base-64 encoded format, validates, and returns an + OAuthBearerToken.
    +
    +
    validate(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ValidatorAccessTokenValidator
    +
    +
    Accepts an OAuth JWT access token in base-64 encoded format, validates, and returns an + OAuthBearerToken.
    +
    validate(String) - Static method in class org.apache.kafka.streams.kstream.Named
     
    validate(Map<String, String>) - Method in class org.apache.kafka.common.config.ConfigDef
    @@ -17008,8 +17351,89 @@

    V

    validateAll(Map<String, String>) - Method in class org.apache.kafka.common.config.ConfigDef
     
    +
    validateClaimNameOverride(String, String) - Static method in class org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
    Validates that the given claim name override is valid, where invalid means + any of the following: + + + null + Zero length + Whitespace only +
    +
    validatedExtensions() - Method in class org.apache.kafka.common.security.oauthbearer.OAuthBearerExtensionsValidatorCallback
     
    +
    ValidateException - Exception in org.apache.kafka.common.security.oauthbearer.secured
    +
    +
    ValidateException is thrown in cases where a JWT access token cannot be determined to be + valid for one reason or another.
    +
    +
    ValidateException(String) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.ValidateException
    +
     
    +
    ValidateException(String, Throwable) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.ValidateException
    +
     
    +
    ValidateException(Throwable) - Constructor for exception org.apache.kafka.common.security.oauthbearer.secured.ValidateException
    +
     
    +
    validateExpiration(String, Long) - Static method in class org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
    Validates that the given lifetime is valid, where invalid means any of + the following: + + + null + Negative +
    +
    +
    validateFile(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
    +
    Validates that, if a value is supplied, it is a file that: + + exists + has read permission + points to a file + + If the value is null or an empty string, it is assumed to be an "empty" value and thus ignored.
    +
    +
    validateInteger(String, boolean) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
    +
    Validates that, if a value is supplied, it is a value that: + + is an Integer + has a value that is not less than the provided minimum value + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored.
    +
    +
    validateIssuedAt(String, Long) - Static method in class org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
    Validates that the given issued-at claim value is valid, where invalid means any of + the following: + + + Negative +
    +
    +
    validateLong(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
    +
Validates that, if a value is supplied, it is a Long whose value is not less than the provided minimum value. If the value is null or an empty string, it is assumed to be an "empty" value and thus ignored.
    +
    +
    validateLong(String, boolean) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
     
    +
    validateLong(String, boolean, Long) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
     
    validateOnly() - Method in class org.apache.kafka.clients.admin.AlterClientQuotasOptions
    Returns whether the request should be validated without altering the configs.
    @@ -17042,6 +17466,50 @@

    V

    Validates the provided configuration.
    +
    validateScopes(String, Collection<String>) - Static method in class org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
Validates that the scopes are valid, where invalid means any of the following: the collection is null, the collection has duplicates, or any element in the collection is null, zero length, or whitespace only.
    +
    +
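The validateScopes contract above can be sketched as follows. This is an illustrative re-implementation of the documented rules (null collection, duplicates, and blank elements are invalid), not Kafka's actual code; the class name and messages are assumptions.

```java
import java.util.Collection;
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch of the documented validateScopes contract.
public class ScopeChecks {

    /** Rejects a null collection, duplicates, and null/empty/whitespace-only entries. */
    public static Set<String> validateScopes(String claimName, Collection<String> scopes) {
        if (scopes == null)
            throw new IllegalArgumentException(claimName + " collection must be non-null");
        Set<String> copy = new LinkedHashSet<>();
        for (String scope : scopes) {
            if (scope == null || scope.trim().isEmpty())
                throw new IllegalArgumentException(claimName + " elements must be non-blank");
            if (!copy.add(scope))  // add() returns false on a duplicate
                throw new IllegalArgumentException(claimName + " contains duplicate: " + scope);
        }
        return copy;
    }

    public static void main(String[] args) {
        System.out.println(validateScopes("scope", java.util.List.of("read", "write")));
    }
}
```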
    validateString(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
     
    +
    validateString(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.JaasOptionsUtils
    +
     
    +
    validateString(String, boolean) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
     
    +
    validateString(String, boolean) - Method in class org.apache.kafka.common.security.oauthbearer.secured.JaasOptionsUtils
    +
     
    +
    validateSubject(String, String) - Static method in class org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
Validates that the given claim value is valid, where invalid means any of the following: null, zero length, or whitespace only.
    +
    +
    validateUrl(String) - Method in class org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
    +
Validates that the configured URL is well-formed, contains a scheme, and uses either the HTTP, HTTPS, or file protocol. No effort is made to connect to the URL in the validation step.
    +
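The validateUrl contract above (well-formed, has a scheme, scheme is http/https/file, no connection attempted) can be sketched as follows. This is an illustrative re-implementation using java.net.URI, not Kafka's actual ConfigurationUtils code; the class name and the configuration-property name in main are assumptions.

```java
import java.net.URI;
import java.net.URISyntaxException;

// Hypothetical sketch of the documented validateUrl contract.
public class UrlChecks {

    /** The URL must parse, carry a scheme, and use http, https, or file.
     *  No connection is attempted during validation. */
    public static URI validateUrl(String name, String value) {
        final URI uri;
        try {
            uri = new URI(value);
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException(name + " is not a well-formed URL: " + value, e);
        }
        String scheme = uri.getScheme();
        if (scheme == null)
            throw new IllegalArgumentException(name + " is missing a scheme: " + value);
        scheme = scheme.toLowerCase();
        if (!scheme.equals("http") && !scheme.equals("https") && !scheme.equals("file"))
            throw new IllegalArgumentException(name + " must use http, https, or file: " + value);
        return uri;
    }

    public static void main(String[] args) {
        // Property name below is illustrative only.
        System.out.println(validateUrl("jwks.endpoint.url",
                "https://example.com/.well-known/jwks.json"));
    }
}
```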
    validateValue(Object) - Method in class org.apache.kafka.connect.data.ConnectSchema
    Validate that the value can be used for this schema, i.e.
    @@ -17054,6 +17522,20 @@

    V

    validator - Variable in class org.apache.kafka.common.config.ConfigDef.ConfigKey
     
    +
    ValidatorAccessTokenValidator - Class in org.apache.kafka.common.security.oauthbearer.secured
    +
    +
ValidatorAccessTokenValidator is an implementation of AccessTokenValidator used by the broker to perform more extensive validation of the JWT access token received from the client, which in turn obtained it by posting the client credentials to the OAuth/OIDC provider's token endpoint.
    +
    +
    ValidatorAccessTokenValidator(Integer, Set<String>, String, VerificationKeyResolver, String, String) - Constructor for class org.apache.kafka.common.security.oauthbearer.secured.ValidatorAccessTokenValidator
    +
    +
Creates a new ValidatorAccessTokenValidator that will be used by the broker for more thorough validation of the JWT.
    +
    +
    ValidatorAccessTokenValidator.ClaimSupplier<T> - Interface in org.apache.kafka.common.security.oauthbearer.secured
    +
     
    validValues(String, Map<String, Object>) - Method in interface org.apache.kafka.common.config.ConfigDef.Recommender
    The valid values for the configuration given the current configuration values.
    @@ -17094,6 +17576,12 @@

    V

    RFC 6750 Section 2.1
    +
    value() - Method in class org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken
    +
    +
The b64token value as defined in RFC 6750 Section 2.1.
    +
    value() - Method in class org.apache.kafka.connect.connector.ConnectRecord
     
    value() - Method in class org.apache.kafka.connect.data.SchemaAndValue
    @@ -17698,6 +18186,10 @@

    V

    A ValueTransformerWithKeySupplier interface which can create one or more ValueTransformerWithKey instances.
    +
    VerificationKeyResolverFactory - Class in org.apache.kafka.common.security.oauthbearer.secured
    +
     
    +
    VerificationKeyResolverFactory() - Constructor for class org.apache.kafka.common.security.oauthbearer.secured.VerificationKeyResolverFactory
    +
     
    version() - Method in interface org.apache.kafka.clients.consumer.ConsumerPartitionAssignor
    Return the version of the assignor which indicates how the user metadata encodings diff --git a/31/javadoc/index.html b/31/javadoc/index.html index 8a031f97c..51f18f9cd 100644 --- a/31/javadoc/index.html +++ b/31/javadoc/index.html @@ -2,7 +2,7 @@ -Overview (kafka 3.1.0 API) +Overview (kafka 3.1.1 API) @@ -48,7 +48,7 @@
    -

    kafka 3.1.0 API

    +

    kafka 3.1.1 API

    Packages
    @@ -87,66 +87,68 @@

    kafka 3.1.0 API

     
     
     
    diff --git a/31/javadoc/jquery/external/jquery/jquery.js b/31/javadoc/jquery/external/jquery/jquery.js new file mode 100644 index 000000000..50937333b --- /dev/null +++ b/31/javadoc/jquery/external/jquery/jquery.js @@ -0,0 +1,10872 @@ +/*! + * jQuery JavaScript Library v3.5.1 + * https://jquery.com/ + * + * Includes Sizzle.js + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://jquery.org/license + * + * Date: 2020-05-04T22:49Z + */ +( function( global, factory ) { + + "use strict"; + + if ( typeof module === "object" && typeof module.exports === "object" ) { + + // For CommonJS and CommonJS-like environments where a proper `window` + // is present, execute the factory and get jQuery. + // For environments that do not have a `window` with a `document` + // (such as Node.js), expose a factory as module.exports. + // This accentuates the need for the creation of a real `window`. + // e.g. var jQuery = require("jquery")(window); + // See ticket #14549 for more info. + module.exports = global.document ? + factory( global, true ) : + function( w ) { + if ( !w.document ) { + throw new Error( "jQuery requires a window with a document" ); + } + return factory( w ); + }; + } else { + factory( global ); + } + +// Pass this if window is not defined yet +} )( typeof window !== "undefined" ? window : this, function( window, noGlobal ) { + +// Edge <= 12 - 13+, Firefox <=18 - 45+, IE 10 - 11, Safari 5.1 - 9+, iOS 6 - 9.1 +// throw exceptions when non-strict code (e.g., ASP.NET 4.5) accesses strict mode +// arguments.callee.caller (trac-13335). But as of jQuery 3.0 (2016), strict mode should be common +// enough that all such attempts are guarded in a try block. +"use strict"; + +var arr = []; + +var getProto = Object.getPrototypeOf; + +var slice = arr.slice; + +var flat = arr.flat ? 
function( array ) { + return arr.flat.call( array ); +} : function( array ) { + return arr.concat.apply( [], array ); +}; + + +var push = arr.push; + +var indexOf = arr.indexOf; + +var class2type = {}; + +var toString = class2type.toString; + +var hasOwn = class2type.hasOwnProperty; + +var fnToString = hasOwn.toString; + +var ObjectFunctionString = fnToString.call( Object ); + +var support = {}; + +var isFunction = function isFunction( obj ) { + + // Support: Chrome <=57, Firefox <=52 + // In some browsers, typeof returns "function" for HTML elements + // (i.e., `typeof document.createElement( "object" ) === "function"`). + // We don't want to classify *any* DOM node as a function. + return typeof obj === "function" && typeof obj.nodeType !== "number"; + }; + + +var isWindow = function isWindow( obj ) { + return obj != null && obj === obj.window; + }; + + +var document = window.document; + + + + var preservedScriptAttributes = { + type: true, + src: true, + nonce: true, + noModule: true + }; + + function DOMEval( code, node, doc ) { + doc = doc || document; + + var i, val, + script = doc.createElement( "script" ); + + script.text = code; + if ( node ) { + for ( i in preservedScriptAttributes ) { + + // Support: Firefox 64+, Edge 18+ + // Some browsers don't support the "nonce" property on scripts. + // On the other hand, just using `getAttribute` is not enough as + // the `nonce` attribute is reset to an empty string whenever it + // becomes browsing-context connected. + // See https://github.com/whatwg/html/issues/2369 + // See https://html.spec.whatwg.org/#nonce-attributes + // The `node.getAttribute` check was added for the sake of + // `jQuery.globalEval` so that it can fake a nonce-containing node + // via an object. 
+ val = node[ i ] || node.getAttribute && node.getAttribute( i ); + if ( val ) { + script.setAttribute( i, val ); + } + } + } + doc.head.appendChild( script ).parentNode.removeChild( script ); + } + + +function toType( obj ) { + if ( obj == null ) { + return obj + ""; + } + + // Support: Android <=2.3 only (functionish RegExp) + return typeof obj === "object" || typeof obj === "function" ? + class2type[ toString.call( obj ) ] || "object" : + typeof obj; +} +/* global Symbol */ +// Defining this global in .eslintrc.json would create a danger of using the global +// unguarded in another place, it seems safer to define global only for this module + + + +var + version = "3.5.1", + + // Define a local copy of jQuery + jQuery = function( selector, context ) { + + // The jQuery object is actually just the init constructor 'enhanced' + // Need init if jQuery is called (just allow error to be thrown if not included) + return new jQuery.fn.init( selector, context ); + }; + +jQuery.fn = jQuery.prototype = { + + // The current version of jQuery being used + jquery: version, + + constructor: jQuery, + + // The default length of a jQuery object is 0 + length: 0, + + toArray: function() { + return slice.call( this ); + }, + + // Get the Nth element in the matched element set OR + // Get the whole matched element set as a clean array + get: function( num ) { + + // Return all the elements in a clean array + if ( num == null ) { + return slice.call( this ); + } + + // Return just the one element from the set + return num < 0 ? 
this[ num + this.length ] : this[ num ]; + }, + + // Take an array of elements and push it onto the stack + // (returning the new matched element set) + pushStack: function( elems ) { + + // Build a new jQuery matched element set + var ret = jQuery.merge( this.constructor(), elems ); + + // Add the old object onto the stack (as a reference) + ret.prevObject = this; + + // Return the newly-formed element set + return ret; + }, + + // Execute a callback for every element in the matched set. + each: function( callback ) { + return jQuery.each( this, callback ); + }, + + map: function( callback ) { + return this.pushStack( jQuery.map( this, function( elem, i ) { + return callback.call( elem, i, elem ); + } ) ); + }, + + slice: function() { + return this.pushStack( slice.apply( this, arguments ) ); + }, + + first: function() { + return this.eq( 0 ); + }, + + last: function() { + return this.eq( -1 ); + }, + + even: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return ( i + 1 ) % 2; + } ) ); + }, + + odd: function() { + return this.pushStack( jQuery.grep( this, function( _elem, i ) { + return i % 2; + } ) ); + }, + + eq: function( i ) { + var len = this.length, + j = +i + ( i < 0 ? len : 0 ); + return this.pushStack( j >= 0 && j < len ? [ this[ j ] ] : [] ); + }, + + end: function() { + return this.prevObject || this.constructor(); + }, + + // For internal use only. + // Behaves like an Array's method, not like a jQuery method. 
+ push: push, + sort: arr.sort, + splice: arr.splice +}; + +jQuery.extend = jQuery.fn.extend = function() { + var options, name, src, copy, copyIsArray, clone, + target = arguments[ 0 ] || {}, + i = 1, + length = arguments.length, + deep = false; + + // Handle a deep copy situation + if ( typeof target === "boolean" ) { + deep = target; + + // Skip the boolean and the target + target = arguments[ i ] || {}; + i++; + } + + // Handle case when target is a string or something (possible in deep copy) + if ( typeof target !== "object" && !isFunction( target ) ) { + target = {}; + } + + // Extend jQuery itself if only one argument is passed + if ( i === length ) { + target = this; + i--; + } + + for ( ; i < length; i++ ) { + + // Only deal with non-null/undefined values + if ( ( options = arguments[ i ] ) != null ) { + + // Extend the base object + for ( name in options ) { + copy = options[ name ]; + + // Prevent Object.prototype pollution + // Prevent never-ending loop + if ( name === "__proto__" || target === copy ) { + continue; + } + + // Recurse if we're merging plain objects or arrays + if ( deep && copy && ( jQuery.isPlainObject( copy ) || + ( copyIsArray = Array.isArray( copy ) ) ) ) { + src = target[ name ]; + + // Ensure proper type for the source value + if ( copyIsArray && !Array.isArray( src ) ) { + clone = []; + } else if ( !copyIsArray && !jQuery.isPlainObject( src ) ) { + clone = {}; + } else { + clone = src; + } + copyIsArray = false; + + // Never move original objects, clone them + target[ name ] = jQuery.extend( deep, clone, copy ); + + // Don't bring in undefined values + } else if ( copy !== undefined ) { + target[ name ] = copy; + } + } + } + } + + // Return the modified object + return target; +}; + +jQuery.extend( { + + // Unique for each copy of jQuery on the page + expando: "jQuery" + ( version + Math.random() ).replace( /\D/g, "" ), + + // Assume jQuery is ready without the ready module + isReady: true, + + error: function( msg ) { + throw new 
Error( msg ); + }, + + noop: function() {}, + + isPlainObject: function( obj ) { + var proto, Ctor; + + // Detect obvious negatives + // Use toString instead of jQuery.type to catch host objects + if ( !obj || toString.call( obj ) !== "[object Object]" ) { + return false; + } + + proto = getProto( obj ); + + // Objects with no prototype (e.g., `Object.create( null )`) are plain + if ( !proto ) { + return true; + } + + // Objects with prototype are plain iff they were constructed by a global Object function + Ctor = hasOwn.call( proto, "constructor" ) && proto.constructor; + return typeof Ctor === "function" && fnToString.call( Ctor ) === ObjectFunctionString; + }, + + isEmptyObject: function( obj ) { + var name; + + for ( name in obj ) { + return false; + } + return true; + }, + + // Evaluates a script in a provided context; falls back to the global one + // if not specified. + globalEval: function( code, options, doc ) { + DOMEval( code, { nonce: options && options.nonce }, doc ); + }, + + each: function( obj, callback ) { + var length, i = 0; + + if ( isArrayLike( obj ) ) { + length = obj.length; + for ( ; i < length; i++ ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } else { + for ( i in obj ) { + if ( callback.call( obj[ i ], i, obj[ i ] ) === false ) { + break; + } + } + } + + return obj; + }, + + // results is for internal usage only + makeArray: function( arr, results ) { + var ret = results || []; + + if ( arr != null ) { + if ( isArrayLike( Object( arr ) ) ) { + jQuery.merge( ret, + typeof arr === "string" ? + [ arr ] : arr + ); + } else { + push.call( ret, arr ); + } + } + + return ret; + }, + + inArray: function( elem, arr, i ) { + return arr == null ? 
-1 : indexOf.call( arr, elem, i ); + }, + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + merge: function( first, second ) { + var len = +second.length, + j = 0, + i = first.length; + + for ( ; j < len; j++ ) { + first[ i++ ] = second[ j ]; + } + + first.length = i; + + return first; + }, + + grep: function( elems, callback, invert ) { + var callbackInverse, + matches = [], + i = 0, + length = elems.length, + callbackExpect = !invert; + + // Go through the array, only saving the items + // that pass the validator function + for ( ; i < length; i++ ) { + callbackInverse = !callback( elems[ i ], i ); + if ( callbackInverse !== callbackExpect ) { + matches.push( elems[ i ] ); + } + } + + return matches; + }, + + // arg is for internal usage only + map: function( elems, callback, arg ) { + var length, value, + i = 0, + ret = []; + + // Go through the array, translating each of the items to their new values + if ( isArrayLike( elems ) ) { + length = elems.length; + for ( ; i < length; i++ ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + + // Go through every key on the object, + } else { + for ( i in elems ) { + value = callback( elems[ i ], i, arg ); + + if ( value != null ) { + ret.push( value ); + } + } + } + + // Flatten any nested arrays + return flat( ret ); + }, + + // A global GUID counter for objects + guid: 1, + + // jQuery.support is not used in Core but other projects attach their + // properties to it so it needs to exist. 
+ support: support +} ); + +if ( typeof Symbol === "function" ) { + jQuery.fn[ Symbol.iterator ] = arr[ Symbol.iterator ]; +} + +// Populate the class2type map +jQuery.each( "Boolean Number String Function Array Date RegExp Object Error Symbol".split( " " ), +function( _i, name ) { + class2type[ "[object " + name + "]" ] = name.toLowerCase(); +} ); + +function isArrayLike( obj ) { + + // Support: real iOS 8.2 only (not reproducible in simulator) + // `in` check used to prevent JIT error (gh-2145) + // hasOwn isn't used here due to false negatives + // regarding Nodelist length in IE + var length = !!obj && "length" in obj && obj.length, + type = toType( obj ); + + if ( isFunction( obj ) || isWindow( obj ) ) { + return false; + } + + return type === "array" || length === 0 || + typeof length === "number" && length > 0 && ( length - 1 ) in obj; +} +var Sizzle = +/*! + * Sizzle CSS Selector Engine v2.3.5 + * https://sizzlejs.com/ + * + * Copyright JS Foundation and other contributors + * Released under the MIT license + * https://js.foundation/ + * + * Date: 2020-03-14 + */ +( function( window ) { +var i, + support, + Expr, + getText, + isXML, + tokenize, + compile, + select, + outermostContext, + sortInput, + hasDuplicate, + + // Local document vars + setDocument, + document, + docElem, + documentIsHTML, + rbuggyQSA, + rbuggyMatches, + matches, + contains, + + // Instance-specific data + expando = "sizzle" + 1 * new Date(), + preferredDoc = window.document, + dirruns = 0, + done = 0, + classCache = createCache(), + tokenCache = createCache(), + compilerCache = createCache(), + nonnativeSelectorCache = createCache(), + sortOrder = function( a, b ) { + if ( a === b ) { + hasDuplicate = true; + } + return 0; + }, + + // Instance methods + hasOwn = ( {} ).hasOwnProperty, + arr = [], + pop = arr.pop, + pushNative = arr.push, + push = arr.push, + slice = arr.slice, + + // Use a stripped-down indexOf as it's faster than native + // https://jsperf.com/thor-indexof-vs-for/5 + 
indexOf = function( list, elem ) { + var i = 0, + len = list.length; + for ( ; i < len; i++ ) { + if ( list[ i ] === elem ) { + return i; + } + } + return -1; + }, + + booleans = "checked|selected|async|autofocus|autoplay|controls|defer|disabled|hidden|" + + "ismap|loop|multiple|open|readonly|required|scoped", + + // Regular expressions + + // http://www.w3.org/TR/css3-selectors/#whitespace + whitespace = "[\\x20\\t\\r\\n\\f]", + + // https://www.w3.org/TR/css-syntax-3/#ident-token-diagram + identifier = "(?:\\\\[\\da-fA-F]{1,6}" + whitespace + + "?|\\\\[^\\r\\n\\f]|[\\w-]|[^\0-\\x7f])+", + + // Attribute selectors: http://www.w3.org/TR/selectors/#attribute-selectors + attributes = "\\[" + whitespace + "*(" + identifier + ")(?:" + whitespace + + + // Operator (capture 2) + "*([*^$|!~]?=)" + whitespace + + + // "Attribute values must be CSS identifiers [capture 5] + // or strings [capture 3 or capture 4]" + "*(?:'((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\"|(" + identifier + "))|)" + + whitespace + "*\\]", + + pseudos = ":(" + identifier + ")(?:\\((" + + + // To reduce the number of selectors needing tokenize in the preFilter, prefer arguments: + // 1. quoted (capture 3; capture 4 or capture 5) + "('((?:\\\\.|[^\\\\'])*)'|\"((?:\\\\.|[^\\\\\"])*)\")|" + + + // 2. simple (capture 6) + "((?:\\\\.|[^\\\\()[\\]]|" + attributes + ")*)|" + + + // 3. 
anything else (capture 2) + ".*" + + ")\\)|)", + + // Leading and non-escaped trailing whitespace, capturing some non-whitespace characters preceding the latter + rwhitespace = new RegExp( whitespace + "+", "g" ), + rtrim = new RegExp( "^" + whitespace + "+|((?:^|[^\\\\])(?:\\\\.)*)" + + whitespace + "+$", "g" ), + + rcomma = new RegExp( "^" + whitespace + "*," + whitespace + "*" ), + rcombinators = new RegExp( "^" + whitespace + "*([>+~]|" + whitespace + ")" + whitespace + + "*" ), + rdescend = new RegExp( whitespace + "|>" ), + + rpseudo = new RegExp( pseudos ), + ridentifier = new RegExp( "^" + identifier + "$" ), + + matchExpr = { + "ID": new RegExp( "^#(" + identifier + ")" ), + "CLASS": new RegExp( "^\\.(" + identifier + ")" ), + "TAG": new RegExp( "^(" + identifier + "|[*])" ), + "ATTR": new RegExp( "^" + attributes ), + "PSEUDO": new RegExp( "^" + pseudos ), + "CHILD": new RegExp( "^:(only|first|last|nth|nth-last)-(child|of-type)(?:\\(" + + whitespace + "*(even|odd|(([+-]|)(\\d*)n|)" + whitespace + "*(?:([+-]|)" + + whitespace + "*(\\d+)|))" + whitespace + "*\\)|)", "i" ), + "bool": new RegExp( "^(?:" + booleans + ")$", "i" ), + + // For use in libraries implementing .is() + // We use this for POS matching in `select` + "needsContext": new RegExp( "^" + whitespace + + "*[>+~]|:(even|odd|eq|gt|lt|nth|first|last)(?:\\(" + whitespace + + "*((?:-\\d)?\\d*)" + whitespace + "*\\)|)(?=[^-]|$)", "i" ) + }, + + rhtml = /HTML$/i, + rinputs = /^(?:input|select|textarea|button)$/i, + rheader = /^h\d$/i, + + rnative = /^[^{]+\{\s*\[native \w/, + + // Easily-parseable/retrievable ID or TAG or CLASS selectors + rquickExpr = /^(?:#([\w-]+)|(\w+)|\.([\w-]+))$/, + + rsibling = /[+~]/, + + // CSS escapes + // http://www.w3.org/TR/CSS21/syndata.html#escaped-characters + runescape = new RegExp( "\\\\[\\da-fA-F]{1,6}" + whitespace + "?|\\\\([^\\r\\n\\f])", "g" ), + funescape = function( escape, nonHex ) { + var high = "0x" + escape.slice( 1 ) - 0x10000; + + return nonHex ? 
+ + // Strip the backslash prefix from a non-hex escape sequence + nonHex : + + // Replace a hexadecimal escape sequence with the encoded Unicode code point + // Support: IE <=11+ + // For values outside the Basic Multilingual Plane (BMP), manually construct a + // surrogate pair + high < 0 ? + String.fromCharCode( high + 0x10000 ) : + String.fromCharCode( high >> 10 | 0xD800, high & 0x3FF | 0xDC00 ); + }, + + // CSS string/identifier serialization + // https://drafts.csswg.org/cssom/#common-serializing-idioms + rcssescape = /([\0-\x1f\x7f]|^-?\d)|^-$|[^\0-\x1f\x7f-\uFFFF\w-]/g, + fcssescape = function( ch, asCodePoint ) { + if ( asCodePoint ) { + + // U+0000 NULL becomes U+FFFD REPLACEMENT CHARACTER + if ( ch === "\0" ) { + return "\uFFFD"; + } + + // Control characters and (dependent upon position) numbers get escaped as code points + return ch.slice( 0, -1 ) + "\\" + + ch.charCodeAt( ch.length - 1 ).toString( 16 ) + " "; + } + + // Other potentially-special ASCII characters get backslash-escaped + return "\\" + ch; + }, + + // Used for iframes + // See setDocument() + // Removing the function wrapper causes a "Permission Denied" + // error in IE + unloadHandler = function() { + setDocument(); + }, + + inDisabledFieldset = addCombinator( + function( elem ) { + return elem.disabled === true && elem.nodeName.toLowerCase() === "fieldset"; + }, + { dir: "parentNode", next: "legend" } + ); + +// Optimize for push.apply( _, NodeList ) +try { + push.apply( + ( arr = slice.call( preferredDoc.childNodes ) ), + preferredDoc.childNodes + ); + + // Support: Android<4.0 + // Detect silently failing push.apply + // eslint-disable-next-line no-unused-expressions + arr[ preferredDoc.childNodes.length ].nodeType; +} catch ( e ) { + push = { apply: arr.length ? 
+ + // Leverage slice if possible + function( target, els ) { + pushNative.apply( target, slice.call( els ) ); + } : + + // Support: IE<9 + // Otherwise append directly + function( target, els ) { + var j = target.length, + i = 0; + + // Can't trust NodeList.length + while ( ( target[ j++ ] = els[ i++ ] ) ) {} + target.length = j - 1; + } + }; +} + +function Sizzle( selector, context, results, seed ) { + var m, i, elem, nid, match, groups, newSelector, + newContext = context && context.ownerDocument, + + // nodeType defaults to 9, since context defaults to document + nodeType = context ? context.nodeType : 9; + + results = results || []; + + // Return early from calls with invalid selector or context + if ( typeof selector !== "string" || !selector || + nodeType !== 1 && nodeType !== 9 && nodeType !== 11 ) { + + return results; + } + + // Try to shortcut find operations (as opposed to filters) in HTML documents + if ( !seed ) { + setDocument( context ); + context = context || document; + + if ( documentIsHTML ) { + + // If the selector is sufficiently simple, try using a "get*By*" DOM method + // (excepting DocumentFragment context, where the methods don't exist) + if ( nodeType !== 11 && ( match = rquickExpr.exec( selector ) ) ) { + + // ID selector + if ( ( m = match[ 1 ] ) ) { + + // Document context + if ( nodeType === 9 ) { + if ( ( elem = context.getElementById( m ) ) ) { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( elem.id === m ) { + results.push( elem ); + return results; + } + } else { + return results; + } + + // Element context + } else { + + // Support: IE, Opera, Webkit + // TODO: identify versions + // getElementById can match elements by name instead of ID + if ( newContext && ( elem = newContext.getElementById( m ) ) && + contains( context, elem ) && + elem.id === m ) { + + results.push( elem ); + return results; + } + } + + // Type selector + } else if ( match[ 2 
] ) { + push.apply( results, context.getElementsByTagName( selector ) ); + return results; + + // Class selector + } else if ( ( m = match[ 3 ] ) && support.getElementsByClassName && + context.getElementsByClassName ) { + + push.apply( results, context.getElementsByClassName( m ) ); + return results; + } + } + + // Take advantage of querySelectorAll + if ( support.qsa && + !nonnativeSelectorCache[ selector + " " ] && + ( !rbuggyQSA || !rbuggyQSA.test( selector ) ) && + + // Support: IE 8 only + // Exclude object elements + ( nodeType !== 1 || context.nodeName.toLowerCase() !== "object" ) ) { + + newSelector = selector; + newContext = context; + + // qSA considers elements outside a scoping root when evaluating child or + // descendant combinators, which is not what we want. + // In such cases, we work around the behavior by prefixing every selector in the + // list with an ID selector referencing the scope context. + // The technique has to be used as well when a leading combinator is used + // as such selectors are not recognized by querySelectorAll. + // Thanks to Andrew Dupont for this technique. + if ( nodeType === 1 && + ( rdescend.test( selector ) || rcombinators.test( selector ) ) ) { + + // Expand context for sibling selectors + newContext = rsibling.test( selector ) && testContext( context.parentNode ) || + context; + + // We can use :scope instead of the ID hack if the browser + // supports it & if we're not changing the context. + if ( newContext !== context || !support.scope ) { + + // Capture the context ID, setting it first if necessary + if ( ( nid = context.getAttribute( "id" ) ) ) { + nid = nid.replace( rcssescape, fcssescape ); + } else { + context.setAttribute( "id", ( nid = expando ) ); + } + } + + // Prefix every selector in the list + groups = tokenize( selector ); + i = groups.length; + while ( i-- ) { + groups[ i ] = ( nid ? 
"#" + nid : ":scope" ) + " " + + toSelector( groups[ i ] ); + } + newSelector = groups.join( "," ); + } + + try { + push.apply( results, + newContext.querySelectorAll( newSelector ) + ); + return results; + } catch ( qsaError ) { + nonnativeSelectorCache( selector, true ); + } finally { + if ( nid === expando ) { + context.removeAttribute( "id" ); + } + } + } + } + } + + // All others + return select( selector.replace( rtrim, "$1" ), context, results, seed ); +} + +/** + * Create key-value caches of limited size + * @returns {function(string, object)} Returns the Object data after storing it on itself with + * property name the (space-suffixed) string and (if the cache is larger than Expr.cacheLength) + * deleting the oldest entry + */ +function createCache() { + var keys = []; + + function cache( key, value ) { + + // Use (key + " ") to avoid collision with native prototype properties (see Issue #157) + if ( keys.push( key + " " ) > Expr.cacheLength ) { + + // Only keep the most recent entries + delete cache[ keys.shift() ]; + } + return ( cache[ key + " " ] = value ); + } + return cache; +} + +/** + * Mark a function for special use by Sizzle + * @param {Function} fn The function to mark + */ +function markFunction( fn ) { + fn[ expando ] = true; + return fn; +} + +/** + * Support testing using an element + * @param {Function} fn Passed the created element and returns a boolean result + */ +function assert( fn ) { + var el = document.createElement( "fieldset" ); + + try { + return !!fn( el ); + } catch ( e ) { + return false; + } finally { + + // Remove from its parent by default + if ( el.parentNode ) { + el.parentNode.removeChild( el ); + } + + // release memory in IE + el = null; + } +} + +/** + * Adds the same handler for all of the specified attrs + * @param {String} attrs Pipe-separated list of attributes + * @param {Function} handler The method that will be applied + */ +function addHandle( attrs, handler ) { + var arr = attrs.split( "|" ), + i = 
arr.length; + + while ( i-- ) { + Expr.attrHandle[ arr[ i ] ] = handler; + } +} + +/** + * Checks document order of two siblings + * @param {Element} a + * @param {Element} b + * @returns {Number} Returns less than 0 if a precedes b, greater than 0 if a follows b + */ +function siblingCheck( a, b ) { + var cur = b && a, + diff = cur && a.nodeType === 1 && b.nodeType === 1 && + a.sourceIndex - b.sourceIndex; + + // Use IE sourceIndex if available on both nodes + if ( diff ) { + return diff; + } + + // Check if b follows a + if ( cur ) { + while ( ( cur = cur.nextSibling ) ) { + if ( cur === b ) { + return -1; + } + } + } + + return a ? 1 : -1; +} + +/** + * Returns a function to use in pseudos for input types + * @param {String} type + */ +function createInputPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for buttons + * @param {String} type + */ +function createButtonPseudo( type ) { + return function( elem ) { + var name = elem.nodeName.toLowerCase(); + return ( name === "input" || name === "button" ) && elem.type === type; + }; +} + +/** + * Returns a function to use in pseudos for :enabled/:disabled + * @param {Boolean} disabled true for :disabled; false for :enabled + */ +function createDisabledPseudo( disabled ) { + + // Known :disabled false positives: fieldset[disabled] > legend:nth-of-type(n+2) :can-disable + return function( elem ) { + + // Only certain elements can match :enabled or :disabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-enabled + // https://html.spec.whatwg.org/multipage/scripting.html#selector-disabled + if ( "form" in elem ) { + + // Check for inherited disabledness on relevant non-disabled elements: + // * listed form-associated elements in a disabled fieldset + // https://html.spec.whatwg.org/multipage/forms.html#category-listed + // 
https://html.spec.whatwg.org/multipage/forms.html#concept-fe-disabled + // * option elements in a disabled optgroup + // https://html.spec.whatwg.org/multipage/forms.html#concept-option-disabled + // All such elements have a "form" property. + if ( elem.parentNode && elem.disabled === false ) { + + // Option elements defer to a parent optgroup if present + if ( "label" in elem ) { + if ( "label" in elem.parentNode ) { + return elem.parentNode.disabled === disabled; + } else { + return elem.disabled === disabled; + } + } + + // Support: IE 6 - 11 + // Use the isDisabled shortcut property to check for disabled fieldset ancestors + return elem.isDisabled === disabled || + + // Where there is no isDisabled, check manually + /* jshint -W018 */ + elem.isDisabled !== !disabled && + inDisabledFieldset( elem ) === disabled; + } + + return elem.disabled === disabled; + + // Try to winnow out elements that can't be disabled before trusting the disabled property. + // Some victims get caught in our net (label, legend, menu, track), but it shouldn't + // even exist on them, let alone have a boolean value. 
+ } else if ( "label" in elem ) { + return elem.disabled === disabled; + } + + // Remaining elements are neither :enabled nor :disabled + return false; + }; +} + +/** + * Returns a function to use in pseudos for positionals + * @param {Function} fn + */ +function createPositionalPseudo( fn ) { + return markFunction( function( argument ) { + argument = +argument; + return markFunction( function( seed, matches ) { + var j, + matchIndexes = fn( [], seed.length, argument ), + i = matchIndexes.length; + + // Match elements found at the specified indexes + while ( i-- ) { + if ( seed[ ( j = matchIndexes[ i ] ) ] ) { + seed[ j ] = !( matches[ j ] = seed[ j ] ); + } + } + } ); + } ); +} + +/** + * Checks a node for validity as a Sizzle context + * @param {Element|Object=} context + * @returns {Element|Object|Boolean} The input node if acceptable, otherwise a falsy value + */ +function testContext( context ) { + return context && typeof context.getElementsByTagName !== "undefined" && context; +} + +// Expose support vars for convenience +support = Sizzle.support = {}; + +/** + * Detects XML nodes + * @param {Element|Object} elem An element or a document + * @returns {Boolean} True iff elem is a non-HTML XML node + */ +isXML = Sizzle.isXML = function( elem ) { + var namespace = elem.namespaceURI, + docElem = ( elem.ownerDocument || elem ).documentElement; + + // Support: IE <=8 + // Assume HTML when documentElement doesn't yet exist, such as inside loading iframes + // https://bugs.jquery.com/ticket/4833 + return !rhtml.test( namespace || docElem && docElem.nodeName || "HTML" ); +}; + +/** + * Sets document-related variables once based on the current document + * @param {Element|Object} [doc] An element or document object to use to set the document + * @returns {Object} Returns the current document + */ +setDocument = Sizzle.setDocument = function( node ) { + var hasCompare, subWindow, + doc = node ? 
node.ownerDocument || node : preferredDoc; + + // Return early if doc is invalid or already selected + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( doc == document || doc.nodeType !== 9 || !doc.documentElement ) { + return document; + } + + // Update global variables + document = doc; + docElem = document.documentElement; + documentIsHTML = !isXML( document ); + + // Support: IE 9 - 11+, Edge 12 - 18+ + // Accessing iframe documents after unload throws "permission denied" errors (jQuery #13936) + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( preferredDoc != document && + ( subWindow = document.defaultView ) && subWindow.top !== subWindow ) { + + // Support: IE 11, Edge + if ( subWindow.addEventListener ) { + subWindow.addEventListener( "unload", unloadHandler, false ); + + // Support: IE 9 - 10 only + } else if ( subWindow.attachEvent ) { + subWindow.attachEvent( "onunload", unloadHandler ); + } + } + + // Support: IE 8 - 11+, Edge 12 - 18+, Chrome <=16 - 25 only, Firefox <=3.6 - 31 only, + // Safari 4 - 5 only, Opera <=11.6 - 12.x only + // IE/Edge & older browsers don't support the :scope pseudo-class. + // Support: Safari 6.0 only + // Safari 6.0 supports :scope but it's an alias of :root there. 
+ support.scope = assert( function( el ) { + docElem.appendChild( el ).appendChild( document.createElement( "div" ) ); + return typeof el.querySelectorAll !== "undefined" && + !el.querySelectorAll( ":scope fieldset div" ).length; + } ); + + /* Attributes + ---------------------------------------------------------------------- */ + + // Support: IE<8 + // Verify that getAttribute really returns attributes and not properties + // (excepting IE8 booleans) + support.attributes = assert( function( el ) { + el.className = "i"; + return !el.getAttribute( "className" ); + } ); + + /* getElement(s)By* + ---------------------------------------------------------------------- */ + + // Check if getElementsByTagName("*") returns only elements + support.getElementsByTagName = assert( function( el ) { + el.appendChild( document.createComment( "" ) ); + return !el.getElementsByTagName( "*" ).length; + } ); + + // Support: IE<9 + support.getElementsByClassName = rnative.test( document.getElementsByClassName ); + + // Support: IE<10 + // Check if getElementById returns elements by name + // The broken getElementById methods don't pick up programmatically-set names, + // so use a roundabout getElementsByName test + support.getById = assert( function( el ) { + docElem.appendChild( el ).id = expando; + return !document.getElementsByName || !document.getElementsByName( expando ).length; + } ); + + // ID filter and find + if ( support.getById ) { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + return elem.getAttribute( "id" ) === attrId; + }; + }; + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var elem = context.getElementById( id ); + return elem ? 
[ elem ] : []; + } + }; + } else { + Expr.filter[ "ID" ] = function( id ) { + var attrId = id.replace( runescape, funescape ); + return function( elem ) { + var node = typeof elem.getAttributeNode !== "undefined" && + elem.getAttributeNode( "id" ); + return node && node.value === attrId; + }; + }; + + // Support: IE 6 - 7 only + // getElementById is not reliable as a find shortcut + Expr.find[ "ID" ] = function( id, context ) { + if ( typeof context.getElementById !== "undefined" && documentIsHTML ) { + var node, i, elems, + elem = context.getElementById( id ); + + if ( elem ) { + + // Verify the id attribute + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + + // Fall back on getElementsByName + elems = context.getElementsByName( id ); + i = 0; + while ( ( elem = elems[ i++ ] ) ) { + node = elem.getAttributeNode( "id" ); + if ( node && node.value === id ) { + return [ elem ]; + } + } + } + + return []; + } + }; + } + + // Tag + Expr.find[ "TAG" ] = support.getElementsByTagName ? 
+ function( tag, context ) { + if ( typeof context.getElementsByTagName !== "undefined" ) { + return context.getElementsByTagName( tag ); + + // DocumentFragment nodes don't have gEBTN + } else if ( support.qsa ) { + return context.querySelectorAll( tag ); + } + } : + + function( tag, context ) { + var elem, + tmp = [], + i = 0, + + // By happy coincidence, a (broken) gEBTN appears on DocumentFragment nodes too + results = context.getElementsByTagName( tag ); + + // Filter out possible comments + if ( tag === "*" ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem.nodeType === 1 ) { + tmp.push( elem ); + } + } + + return tmp; + } + return results; + }; + + // Class + Expr.find[ "CLASS" ] = support.getElementsByClassName && function( className, context ) { + if ( typeof context.getElementsByClassName !== "undefined" && documentIsHTML ) { + return context.getElementsByClassName( className ); + } + }; + + /* QSA/matchesSelector + ---------------------------------------------------------------------- */ + + // QSA and matchesSelector support + + // matchesSelector(:active) reports false when true (IE9/Opera 11.5) + rbuggyMatches = []; + + // qSa(:focus) reports false when true (Chrome 21) + // We allow this because of a bug in IE8/9 that throws an error + // whenever `document.activeElement` is accessed on an iframe + // So, we allow :focus to pass through QSA all the time to avoid the IE error + // See https://bugs.jquery.com/ticket/13378 + rbuggyQSA = []; + + if ( ( support.qsa = rnative.test( document.querySelectorAll ) ) ) { + + // Build QSA regex + // Regex strategy adopted from Diego Perini + assert( function( el ) { + + var input; + + // Select is set to empty string on purpose + // This is to test IE's treatment of not explicitly + // setting a boolean content attribute, + // since its presence should be enough + // https://bugs.jquery.com/ticket/12359 + docElem.appendChild( el ).innerHTML = "" + + ""; + + // Support: IE8, Opera 11-12.16 + // Nothing should 
be selected when empty strings follow ^= or $= or *= + // The test attribute must be unknown in Opera but "safe" for WinRT + // https://msdn.microsoft.com/en-us/library/ie/hh465388.aspx#attribute_section + if ( el.querySelectorAll( "[msallowcapture^='']" ).length ) { + rbuggyQSA.push( "[*^$]=" + whitespace + "*(?:''|\"\")" ); + } + + // Support: IE8 + // Boolean attributes and "value" are not treated correctly + if ( !el.querySelectorAll( "[selected]" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*(?:value|" + booleans + ")" ); + } + + // Support: Chrome<29, Android<4.4, Safari<7.0+, iOS<7.0+, PhantomJS<1.9.8+ + if ( !el.querySelectorAll( "[id~=" + expando + "-]" ).length ) { + rbuggyQSA.push( "~=" ); + } + + // Support: IE 11+, Edge 15 - 18+ + // IE 11/Edge don't find elements on a `[name='']` query in some cases. + // Adding a temporary attribute to the document before the selection works + // around the issue. + // Interestingly, IE 10 & older don't seem to have the issue. + input = document.createElement( "input" ); + input.setAttribute( "name", "" ); + el.appendChild( input ); + if ( !el.querySelectorAll( "[name='']" ).length ) { + rbuggyQSA.push( "\\[" + whitespace + "*name" + whitespace + "*=" + + whitespace + "*(?:''|\"\")" ); + } + + // Webkit/Opera - :checked should return selected option elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + // IE8 throws error here and will not see later tests + if ( !el.querySelectorAll( ":checked" ).length ) { + rbuggyQSA.push( ":checked" ); + } + + // Support: Safari 8+, iOS 8+ + // https://bugs.webkit.org/show_bug.cgi?id=136851 + // In-page `selector#id sibling-combinator selector` fails + if ( !el.querySelectorAll( "a#" + expando + "+*" ).length ) { + rbuggyQSA.push( ".#.+[+~]" ); + } + + // Support: Firefox <=3.6 - 5 only + // Old Firefox doesn't throw on a badly-escaped identifier. 
+ el.querySelectorAll( "\\\f" ); + rbuggyQSA.push( "[\\r\\n\\f]" ); + } ); + + assert( function( el ) { + el.innerHTML = "" + + ""; + + // Support: Windows 8 Native Apps + // The type and name attributes are restricted during .innerHTML assignment + var input = document.createElement( "input" ); + input.setAttribute( "type", "hidden" ); + el.appendChild( input ).setAttribute( "name", "D" ); + + // Support: IE8 + // Enforce case-sensitivity of name attribute + if ( el.querySelectorAll( "[name=d]" ).length ) { + rbuggyQSA.push( "name" + whitespace + "*[*^$|!~]?=" ); + } + + // FF 3.5 - :enabled/:disabled and hidden elements (hidden elements are still enabled) + // IE8 throws error here and will not see later tests + if ( el.querySelectorAll( ":enabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: IE9-11+ + // IE's :disabled selector does not pick up the children of disabled fieldsets + docElem.appendChild( el ).disabled = true; + if ( el.querySelectorAll( ":disabled" ).length !== 2 ) { + rbuggyQSA.push( ":enabled", ":disabled" ); + } + + // Support: Opera 10 - 11 only + // Opera 10-11 does not throw on post-comma invalid pseudos + el.querySelectorAll( "*,:x" ); + rbuggyQSA.push( ",.*:" ); + } ); + } + + if ( ( support.matchesSelector = rnative.test( ( matches = docElem.matches || + docElem.webkitMatchesSelector || + docElem.mozMatchesSelector || + docElem.oMatchesSelector || + docElem.msMatchesSelector ) ) ) ) { + + assert( function( el ) { + + // Check to see if it's possible to do matchesSelector + // on a disconnected node (IE 9) + support.disconnectedMatch = matches.call( el, "*" ); + + // This should fail with an exception + // Gecko does not error, returns false instead + matches.call( el, "[s!='']:x" ); + rbuggyMatches.push( "!=", pseudos ); + } ); + } + + rbuggyQSA = rbuggyQSA.length && new RegExp( rbuggyQSA.join( "|" ) ); + rbuggyMatches = rbuggyMatches.length && new RegExp( rbuggyMatches.join( "|" ) ); + + /* Contains 
+ ---------------------------------------------------------------------- */ + hasCompare = rnative.test( docElem.compareDocumentPosition ); + + // Element contains another + // Purposefully self-exclusive + // As in, an element does not contain itself + contains = hasCompare || rnative.test( docElem.contains ) ? + function( a, b ) { + var adown = a.nodeType === 9 ? a.documentElement : a, + bup = b && b.parentNode; + return a === bup || !!( bup && bup.nodeType === 1 && ( + adown.contains ? + adown.contains( bup ) : + a.compareDocumentPosition && a.compareDocumentPosition( bup ) & 16 + ) ); + } : + function( a, b ) { + if ( b ) { + while ( ( b = b.parentNode ) ) { + if ( b === a ) { + return true; + } + } + } + return false; + }; + + /* Sorting + ---------------------------------------------------------------------- */ + + // Document order sorting + sortOrder = hasCompare ? + function( a, b ) { + + // Flag for duplicate removal + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + // Sort on method existence if only one input has compareDocumentPosition + var compare = !a.compareDocumentPosition - !b.compareDocumentPosition; + if ( compare ) { + return compare; + } + + // Calculate position if both inputs belong to the same document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + compare = ( a.ownerDocument || a ) == ( b.ownerDocument || b ) ? + a.compareDocumentPosition( b ) : + + // Otherwise we know they are disconnected + 1; + + // Disconnected nodes + if ( compare & 1 || + ( !support.sortDetached && b.compareDocumentPosition( a ) === compare ) ) { + + // Choose the first element that is related to our preferred document + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ // eslint-disable-next-line eqeqeq + if ( a == document || a.ownerDocument == preferredDoc && + contains( preferredDoc, a ) ) { + return -1; + } + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( b == document || b.ownerDocument == preferredDoc && + contains( preferredDoc, b ) ) { + return 1; + } + + // Maintain original order + return sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + } + + return compare & 4 ? -1 : 1; + } : + function( a, b ) { + + // Exit early if the nodes are identical + if ( a === b ) { + hasDuplicate = true; + return 0; + } + + var cur, + i = 0, + aup = a.parentNode, + bup = b.parentNode, + ap = [ a ], + bp = [ b ]; + + // Parentless nodes are either documents or disconnected + if ( !aup || !bup ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + /* eslint-disable eqeqeq */ + return a == document ? -1 : + b == document ? 1 : + /* eslint-enable eqeqeq */ + aup ? -1 : + bup ? 1 : + sortInput ? + ( indexOf( sortInput, a ) - indexOf( sortInput, b ) ) : + 0; + + // If the nodes are siblings, we can do a quick check + } else if ( aup === bup ) { + return siblingCheck( a, b ); + } + + // Otherwise we need full lists of their ancestors for comparison + cur = a; + while ( ( cur = cur.parentNode ) ) { + ap.unshift( cur ); + } + cur = b; + while ( ( cur = cur.parentNode ) ) { + bp.unshift( cur ); + } + + // Walk down the tree looking for a discrepancy + while ( ap[ i ] === bp[ i ] ) { + i++; + } + + return i ? 
+ + // Do a sibling check if the nodes have a common ancestor + siblingCheck( ap[ i ], bp[ i ] ) : + + // Otherwise nodes in our document sort first + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + /* eslint-disable eqeqeq */ + ap[ i ] == preferredDoc ? -1 : + bp[ i ] == preferredDoc ? 1 : + /* eslint-enable eqeqeq */ + 0; + }; + + return document; +}; + +Sizzle.matches = function( expr, elements ) { + return Sizzle( expr, null, null, elements ); +}; + +Sizzle.matchesSelector = function( elem, expr ) { + setDocument( elem ); + + if ( support.matchesSelector && documentIsHTML && + !nonnativeSelectorCache[ expr + " " ] && + ( !rbuggyMatches || !rbuggyMatches.test( expr ) ) && + ( !rbuggyQSA || !rbuggyQSA.test( expr ) ) ) { + + try { + var ret = matches.call( elem, expr ); + + // IE 9's matchesSelector returns false on disconnected nodes + if ( ret || support.disconnectedMatch || + + // As well, disconnected nodes are said to be in a document + // fragment in IE 9 + elem.document && elem.document.nodeType !== 11 ) { + return ret; + } + } catch ( e ) { + nonnativeSelectorCache( expr, true ); + } + } + + return Sizzle( expr, document, null, [ elem ] ).length > 0; +}; + +Sizzle.contains = function( context, elem ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + if ( ( context.ownerDocument || context ) != document ) { + setDocument( context ); + } + return contains( context, elem ); +}; + +Sizzle.attr = function( elem, name ) { + + // Set document vars if needed + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ // eslint-disable-next-line eqeqeq + if ( ( elem.ownerDocument || elem ) != document ) { + setDocument( elem ); + } + + var fn = Expr.attrHandle[ name.toLowerCase() ], + + // Don't get fooled by Object.prototype properties (jQuery #13807) + val = fn && hasOwn.call( Expr.attrHandle, name.toLowerCase() ) ? + fn( elem, name, !documentIsHTML ) : + undefined; + + return val !== undefined ? + val : + support.attributes || !documentIsHTML ? + elem.getAttribute( name ) : + ( val = elem.getAttributeNode( name ) ) && val.specified ? + val.value : + null; +}; + +Sizzle.escape = function( sel ) { + return ( sel + "" ).replace( rcssescape, fcssescape ); +}; + +Sizzle.error = function( msg ) { + throw new Error( "Syntax error, unrecognized expression: " + msg ); +}; + +/** + * Document sorting and removing duplicates + * @param {ArrayLike} results + */ +Sizzle.uniqueSort = function( results ) { + var elem, + duplicates = [], + j = 0, + i = 0; + + // Unless we *know* we can detect duplicates, assume their presence + hasDuplicate = !support.detectDuplicates; + sortInput = !support.sortStable && results.slice( 0 ); + results.sort( sortOrder ); + + if ( hasDuplicate ) { + while ( ( elem = results[ i++ ] ) ) { + if ( elem === results[ i ] ) { + j = duplicates.push( i ); + } + } + while ( j-- ) { + results.splice( duplicates[ j ], 1 ); + } + } + + // Clear input after sorting to release objects + // See https://github.com/jquery/sizzle/pull/225 + sortInput = null; + + return results; +}; + +/** + * Utility function for retrieving the text value of an array of DOM nodes + * @param {Array|Element} elem + */ +getText = Sizzle.getText = function( elem ) { + var node, + ret = "", + i = 0, + nodeType = elem.nodeType; + + if ( !nodeType ) { + + // If no nodeType, this is expected to be an array + while ( ( node = elem[ i++ ] ) ) { + + // Do not traverse comment nodes + ret += getText( node ); + } + } else if ( nodeType === 1 || nodeType === 9 || nodeType === 11 ) { + + // Use textContent 
for elements + // innerText usage removed for consistency of new lines (jQuery #11153) + if ( typeof elem.textContent === "string" ) { + return elem.textContent; + } else { + + // Traverse its children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + ret += getText( elem ); + } + } + } else if ( nodeType === 3 || nodeType === 4 ) { + return elem.nodeValue; + } + + // Do not include comment or processing instruction nodes + + return ret; +}; + +Expr = Sizzle.selectors = { + + // Can be adjusted by the user + cacheLength: 50, + + createPseudo: markFunction, + + match: matchExpr, + + attrHandle: {}, + + find: {}, + + relative: { + ">": { dir: "parentNode", first: true }, + " ": { dir: "parentNode" }, + "+": { dir: "previousSibling", first: true }, + "~": { dir: "previousSibling" } + }, + + preFilter: { + "ATTR": function( match ) { + match[ 1 ] = match[ 1 ].replace( runescape, funescape ); + + // Move the given value to match[3] whether quoted or unquoted + match[ 3 ] = ( match[ 3 ] || match[ 4 ] || + match[ 5 ] || "" ).replace( runescape, funescape ); + + if ( match[ 2 ] === "~=" ) { + match[ 3 ] = " " + match[ 3 ] + " "; + } + + return match.slice( 0, 4 ); + }, + + "CHILD": function( match ) { + + /* matches from matchExpr["CHILD"] + 1 type (only|nth|...) + 2 what (child|of-type) + 3 argument (even|odd|\d*|\d*n([+-]\d+)?|...) + 4 xn-component of xn+y argument ([+-]?\d*n|) + 5 sign of xn-component + 6 x of xn-component + 7 sign of y-component + 8 y of y-component + */ + match[ 1 ] = match[ 1 ].toLowerCase(); + + if ( match[ 1 ].slice( 0, 3 ) === "nth" ) { + + // nth-* requires argument + if ( !match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + // numeric x and y parameters for Expr.filter.CHILD + // remember that false/true cast respectively to 0/1 + match[ 4 ] = +( match[ 4 ] ? 
+ match[ 5 ] + ( match[ 6 ] || 1 ) : + 2 * ( match[ 3 ] === "even" || match[ 3 ] === "odd" ) ); + match[ 5 ] = +( ( match[ 7 ] + match[ 8 ] ) || match[ 3 ] === "odd" ); + + // other types prohibit arguments + } else if ( match[ 3 ] ) { + Sizzle.error( match[ 0 ] ); + } + + return match; + }, + + "PSEUDO": function( match ) { + var excess, + unquoted = !match[ 6 ] && match[ 2 ]; + + if ( matchExpr[ "CHILD" ].test( match[ 0 ] ) ) { + return null; + } + + // Accept quoted arguments as-is + if ( match[ 3 ] ) { + match[ 2 ] = match[ 4 ] || match[ 5 ] || ""; + + // Strip excess characters from unquoted arguments + } else if ( unquoted && rpseudo.test( unquoted ) && + + // Get excess from tokenize (recursively) + ( excess = tokenize( unquoted, true ) ) && + + // advance to the next closing parenthesis + ( excess = unquoted.indexOf( ")", unquoted.length - excess ) - unquoted.length ) ) { + + // excess is a negative index + match[ 0 ] = match[ 0 ].slice( 0, excess ); + match[ 2 ] = unquoted.slice( 0, excess ); + } + + // Return only captures needed by the pseudo filter method (type and argument) + return match.slice( 0, 3 ); + } + }, + + filter: { + + "TAG": function( nodeNameSelector ) { + var nodeName = nodeNameSelector.replace( runescape, funescape ).toLowerCase(); + return nodeNameSelector === "*" ? 
+ function() { + return true; + } : + function( elem ) { + return elem.nodeName && elem.nodeName.toLowerCase() === nodeName; + }; + }, + + "CLASS": function( className ) { + var pattern = classCache[ className + " " ]; + + return pattern || + ( pattern = new RegExp( "(^|" + whitespace + + ")" + className + "(" + whitespace + "|$)" ) ) && classCache( + className, function( elem ) { + return pattern.test( + typeof elem.className === "string" && elem.className || + typeof elem.getAttribute !== "undefined" && + elem.getAttribute( "class" ) || + "" + ); + } ); + }, + + "ATTR": function( name, operator, check ) { + return function( elem ) { + var result = Sizzle.attr( elem, name ); + + if ( result == null ) { + return operator === "!="; + } + if ( !operator ) { + return true; + } + + result += ""; + + /* eslint-disable max-len */ + + return operator === "=" ? result === check : + operator === "!=" ? result !== check : + operator === "^=" ? check && result.indexOf( check ) === 0 : + operator === "*=" ? check && result.indexOf( check ) > -1 : + operator === "$=" ? check && result.slice( -check.length ) === check : + operator === "~=" ? ( " " + result.replace( rwhitespace, " " ) + " " ).indexOf( check ) > -1 : + operator === "|=" ? result === check || result.slice( 0, check.length + 1 ) === check + "-" : + false; + /* eslint-enable max-len */ + + }; + }, + + "CHILD": function( type, what, _argument, first, last ) { + var simple = type.slice( 0, 3 ) !== "nth", + forward = type.slice( -4 ) !== "last", + ofType = what === "of-type"; + + return first === 1 && last === 0 ? + + // Shortcut for :nth-*(n) + function( elem ) { + return !!elem.parentNode; + } : + + function( elem, _context, xml ) { + var cache, uniqueCache, outerCache, node, nodeIndex, start, + dir = simple !== forward ? 
"nextSibling" : "previousSibling", + parent = elem.parentNode, + name = ofType && elem.nodeName.toLowerCase(), + useCache = !xml && !ofType, + diff = false; + + if ( parent ) { + + // :(first|last|only)-(child|of-type) + if ( simple ) { + while ( dir ) { + node = elem; + while ( ( node = node[ dir ] ) ) { + if ( ofType ? + node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) { + + return false; + } + } + + // Reverse direction for :only-* (if we haven't yet done so) + start = dir = type === "only" && !start && "nextSibling"; + } + return true; + } + + start = [ forward ? parent.firstChild : parent.lastChild ]; + + // non-xml :nth-child(...) stores cache data on `parent` + if ( forward && useCache ) { + + // Seek `elem` from a previously-cached index + + // ...in a gzip-friendly way + node = parent; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 ] === dirruns && cache[ 1 ]; + diff = nodeIndex && cache[ 2 ]; + node = nodeIndex && parent.childNodes[ nodeIndex ]; + + while ( ( node = ++nodeIndex && node && node[ dir ] || + + // Fallback to seeking `elem` from the start + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + // When found, cache indexes on `parent` and break + if ( node.nodeType === 1 && ++diff && node === elem ) { + uniqueCache[ type ] = [ dirruns, nodeIndex, diff ]; + break; + } + } + + } else { + + // Use previously-cached element index if available + if ( useCache ) { + + // ...in a gzip-friendly way + node = elem; + outerCache = node[ expando ] || ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + cache = uniqueCache[ type ] || []; + nodeIndex = cache[ 0 
] === dirruns && cache[ 1 ]; + diff = nodeIndex; + } + + // xml :nth-child(...) + // or :nth-last-child(...) or :nth(-last)?-of-type(...) + if ( diff === false ) { + + // Use the same loop as above to seek `elem` from the start + while ( ( node = ++nodeIndex && node && node[ dir ] || + ( diff = nodeIndex = 0 ) || start.pop() ) ) { + + if ( ( ofType ? + node.nodeName.toLowerCase() === name : + node.nodeType === 1 ) && + ++diff ) { + + // Cache the index of each encountered element + if ( useCache ) { + outerCache = node[ expando ] || + ( node[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ node.uniqueID ] || + ( outerCache[ node.uniqueID ] = {} ); + + uniqueCache[ type ] = [ dirruns, diff ]; + } + + if ( node === elem ) { + break; + } + } + } + } + } + + // Incorporate the offset, then check against cycle size + diff -= last; + return diff === first || ( diff % first === 0 && diff / first >= 0 ); + } + }; + }, + + "PSEUDO": function( pseudo, argument ) { + + // pseudo-class names are case-insensitive + // http://www.w3.org/TR/selectors/#pseudo-classes + // Prioritize by case sensitivity in case custom pseudos are added with uppercase letters + // Remember that setFilters inherits from pseudos + var args, + fn = Expr.pseudos[ pseudo ] || Expr.setFilters[ pseudo.toLowerCase() ] || + Sizzle.error( "unsupported pseudo: " + pseudo ); + + // The user may use createPseudo to indicate that + // arguments are needed to create the filter function + // just as Sizzle does + if ( fn[ expando ] ) { + return fn( argument ); + } + + // But maintain support for old signatures + if ( fn.length > 1 ) { + args = [ pseudo, pseudo, "", argument ]; + return Expr.setFilters.hasOwnProperty( pseudo.toLowerCase() ) ? 
+ markFunction( function( seed, matches ) { + var idx, + matched = fn( seed, argument ), + i = matched.length; + while ( i-- ) { + idx = indexOf( seed, matched[ i ] ); + seed[ idx ] = !( matches[ idx ] = matched[ i ] ); + } + } ) : + function( elem ) { + return fn( elem, 0, args ); + }; + } + + return fn; + } + }, + + pseudos: { + + // Potentially complex pseudos + "not": markFunction( function( selector ) { + + // Trim the selector passed to compile + // to avoid treating leading and trailing + // spaces as combinators + var input = [], + results = [], + matcher = compile( selector.replace( rtrim, "$1" ) ); + + return matcher[ expando ] ? + markFunction( function( seed, matches, _context, xml ) { + var elem, + unmatched = matcher( seed, null, xml, [] ), + i = seed.length; + + // Match elements unmatched by `matcher` + while ( i-- ) { + if ( ( elem = unmatched[ i ] ) ) { + seed[ i ] = !( matches[ i ] = elem ); + } + } + } ) : + function( elem, _context, xml ) { + input[ 0 ] = elem; + matcher( input, null, xml, results ); + + // Don't keep the element (issue #299) + input[ 0 ] = null; + return !results.pop(); + }; + } ), + + "has": markFunction( function( selector ) { + return function( elem ) { + return Sizzle( selector, elem ).length > 0; + }; + } ), + + "contains": markFunction( function( text ) { + text = text.replace( runescape, funescape ); + return function( elem ) { + return ( elem.textContent || getText( elem ) ).indexOf( text ) > -1; + }; + } ), + + // "Whether an element is represented by a :lang() selector + // is based solely on the element's language value + // being equal to the identifier C, + // or beginning with the identifier C immediately followed by "-". + // The matching of C against the element's language value is performed case-insensitively. + // The identifier C does not have to be a valid language name." 
+ // http://www.w3.org/TR/selectors/#lang-pseudo + "lang": markFunction( function( lang ) { + + // lang value must be a valid identifier + if ( !ridentifier.test( lang || "" ) ) { + Sizzle.error( "unsupported lang: " + lang ); + } + lang = lang.replace( runescape, funescape ).toLowerCase(); + return function( elem ) { + var elemLang; + do { + if ( ( elemLang = documentIsHTML ? + elem.lang : + elem.getAttribute( "xml:lang" ) || elem.getAttribute( "lang" ) ) ) { + + elemLang = elemLang.toLowerCase(); + return elemLang === lang || elemLang.indexOf( lang + "-" ) === 0; + } + } while ( ( elem = elem.parentNode ) && elem.nodeType === 1 ); + return false; + }; + } ), + + // Miscellaneous + "target": function( elem ) { + var hash = window.location && window.location.hash; + return hash && hash.slice( 1 ) === elem.id; + }, + + "root": function( elem ) { + return elem === docElem; + }, + + "focus": function( elem ) { + return elem === document.activeElement && + ( !document.hasFocus || document.hasFocus() ) && + !!( elem.type || elem.href || ~elem.tabIndex ); + }, + + // Boolean properties + "enabled": createDisabledPseudo( false ), + "disabled": createDisabledPseudo( true ), + + "checked": function( elem ) { + + // In CSS3, :checked should return both checked and selected elements + // http://www.w3.org/TR/2011/REC-css3-selectors-20110929/#checked + var nodeName = elem.nodeName.toLowerCase(); + return ( nodeName === "input" && !!elem.checked ) || + ( nodeName === "option" && !!elem.selected ); + }, + + "selected": function( elem ) { + + // Accessing this property makes selected-by-default + // options in Safari work properly + if ( elem.parentNode ) { + // eslint-disable-next-line no-unused-expressions + elem.parentNode.selectedIndex; + } + + return elem.selected === true; + }, + + // Contents + "empty": function( elem ) { + + // http://www.w3.org/TR/selectors/#empty-pseudo + // :empty is negated by element (1) or content nodes (text: 3; cdata: 4; entity ref: 5), + // but 
not by others (comment: 8; processing instruction: 7; etc.) + // nodeType < 6 works because attributes (2) do not appear as children + for ( elem = elem.firstChild; elem; elem = elem.nextSibling ) { + if ( elem.nodeType < 6 ) { + return false; + } + } + return true; + }, + + "parent": function( elem ) { + return !Expr.pseudos[ "empty" ]( elem ); + }, + + // Element/input types + "header": function( elem ) { + return rheader.test( elem.nodeName ); + }, + + "input": function( elem ) { + return rinputs.test( elem.nodeName ); + }, + + "button": function( elem ) { + var name = elem.nodeName.toLowerCase(); + return name === "input" && elem.type === "button" || name === "button"; + }, + + "text": function( elem ) { + var attr; + return elem.nodeName.toLowerCase() === "input" && + elem.type === "text" && + + // Support: IE<8 + // New HTML5 attribute values (e.g., "search") appear with elem.type === "text" + ( ( attr = elem.getAttribute( "type" ) ) == null || + attr.toLowerCase() === "text" ); + }, + + // Position-in-collection + "first": createPositionalPseudo( function() { + return [ 0 ]; + } ), + + "last": createPositionalPseudo( function( _matchIndexes, length ) { + return [ length - 1 ]; + } ), + + "eq": createPositionalPseudo( function( _matchIndexes, length, argument ) { + return [ argument < 0 ? argument + length : argument ]; + } ), + + "even": createPositionalPseudo( function( matchIndexes, length ) { + var i = 0; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "odd": createPositionalPseudo( function( matchIndexes, length ) { + var i = 1; + for ( ; i < length; i += 2 ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "lt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? + argument + length : + argument > length ? 
+ length : + argument; + for ( ; --i >= 0; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ), + + "gt": createPositionalPseudo( function( matchIndexes, length, argument ) { + var i = argument < 0 ? argument + length : argument; + for ( ; ++i < length; ) { + matchIndexes.push( i ); + } + return matchIndexes; + } ) + } +}; + +Expr.pseudos[ "nth" ] = Expr.pseudos[ "eq" ]; + +// Add button/input type pseudos +for ( i in { radio: true, checkbox: true, file: true, password: true, image: true } ) { + Expr.pseudos[ i ] = createInputPseudo( i ); +} +for ( i in { submit: true, reset: true } ) { + Expr.pseudos[ i ] = createButtonPseudo( i ); +} + +// Easy API for creating new setFilters +function setFilters() {} +setFilters.prototype = Expr.filters = Expr.pseudos; +Expr.setFilters = new setFilters(); + +tokenize = Sizzle.tokenize = function( selector, parseOnly ) { + var matched, match, tokens, type, + soFar, groups, preFilters, + cached = tokenCache[ selector + " " ]; + + if ( cached ) { + return parseOnly ? 
0 : cached.slice( 0 ); + } + + soFar = selector; + groups = []; + preFilters = Expr.preFilter; + + while ( soFar ) { + + // Comma and first run + if ( !matched || ( match = rcomma.exec( soFar ) ) ) { + if ( match ) { + + // Don't consume trailing commas as valid + soFar = soFar.slice( match[ 0 ].length ) || soFar; + } + groups.push( ( tokens = [] ) ); + } + + matched = false; + + // Combinators + if ( ( match = rcombinators.exec( soFar ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + + // Cast descendant combinators to space + type: match[ 0 ].replace( rtrim, " " ) + } ); + soFar = soFar.slice( matched.length ); + } + + // Filters + for ( type in Expr.filter ) { + if ( ( match = matchExpr[ type ].exec( soFar ) ) && ( !preFilters[ type ] || + ( match = preFilters[ type ]( match ) ) ) ) { + matched = match.shift(); + tokens.push( { + value: matched, + type: type, + matches: match + } ); + soFar = soFar.slice( matched.length ); + } + } + + if ( !matched ) { + break; + } + } + + // Return the length of the invalid excess + // if we're just parsing + // Otherwise, throw an error or return tokens + return parseOnly ? + soFar.length : + soFar ? + Sizzle.error( selector ) : + + // Cache the tokens + tokenCache( selector, groups ).slice( 0 ); +}; + +function toSelector( tokens ) { + var i = 0, + len = tokens.length, + selector = ""; + for ( ; i < len; i++ ) { + selector += tokens[ i ].value; + } + return selector; +} + +function addCombinator( matcher, combinator, base ) { + var dir = combinator.dir, + skip = combinator.next, + key = skip || dir, + checkNonElements = base && key === "parentNode", + doneName = done++; + + return combinator.first ? 
+ + // Check against closest ancestor/preceding element + function( elem, context, xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + return matcher( elem, context, xml ); + } + } + return false; + } : + + // Check against all ancestor/preceding elements + function( elem, context, xml ) { + var oldCache, uniqueCache, outerCache, + newCache = [ dirruns, doneName ]; + + // We can't set arbitrary data on XML nodes, so they don't benefit from combinator caching + if ( xml ) { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + if ( matcher( elem, context, xml ) ) { + return true; + } + } + } + } else { + while ( ( elem = elem[ dir ] ) ) { + if ( elem.nodeType === 1 || checkNonElements ) { + outerCache = elem[ expando ] || ( elem[ expando ] = {} ); + + // Support: IE <9 only + // Defend against cloned attroperties (jQuery gh-1709) + uniqueCache = outerCache[ elem.uniqueID ] || + ( outerCache[ elem.uniqueID ] = {} ); + + if ( skip && skip === elem.nodeName.toLowerCase() ) { + elem = elem[ dir ] || elem; + } else if ( ( oldCache = uniqueCache[ key ] ) && + oldCache[ 0 ] === dirruns && oldCache[ 1 ] === doneName ) { + + // Assign to newCache so results back-propagate to previous elements + return ( newCache[ 2 ] = oldCache[ 2 ] ); + } else { + + // Reuse newcache so results back-propagate to previous elements + uniqueCache[ key ] = newCache; + + // A match means we're done; a fail means we have to keep checking + if ( ( newCache[ 2 ] = matcher( elem, context, xml ) ) ) { + return true; + } + } + } + } + } + return false; + }; +} + +function elementMatcher( matchers ) { + return matchers.length > 1 ? 
+ function( elem, context, xml ) { + var i = matchers.length; + while ( i-- ) { + if ( !matchers[ i ]( elem, context, xml ) ) { + return false; + } + } + return true; + } : + matchers[ 0 ]; +} + +function multipleContexts( selector, contexts, results ) { + var i = 0, + len = contexts.length; + for ( ; i < len; i++ ) { + Sizzle( selector, contexts[ i ], results ); + } + return results; +} + +function condense( unmatched, map, filter, context, xml ) { + var elem, + newUnmatched = [], + i = 0, + len = unmatched.length, + mapped = map != null; + + for ( ; i < len; i++ ) { + if ( ( elem = unmatched[ i ] ) ) { + if ( !filter || filter( elem, context, xml ) ) { + newUnmatched.push( elem ); + if ( mapped ) { + map.push( i ); + } + } + } + } + + return newUnmatched; +} + +function setMatcher( preFilter, selector, matcher, postFilter, postFinder, postSelector ) { + if ( postFilter && !postFilter[ expando ] ) { + postFilter = setMatcher( postFilter ); + } + if ( postFinder && !postFinder[ expando ] ) { + postFinder = setMatcher( postFinder, postSelector ); + } + return markFunction( function( seed, results, context, xml ) { + var temp, i, elem, + preMap = [], + postMap = [], + preexisting = results.length, + + // Get initial elements from seed or context + elems = seed || multipleContexts( + selector || "*", + context.nodeType ? [ context ] : context, + [] + ), + + // Prefilter to get matcher input, preserving a map for seed-results synchronization + matcherIn = preFilter && ( seed || !selector ) ? + condense( elems, preMap, preFilter, context, xml ) : + elems, + + matcherOut = matcher ? + + // If we have a postFinder, or filtered seed, or non-seed postFilter or preexisting results, + postFinder || ( seed ? preFilter : preexisting || postFilter ) ? 
+ + // ...intermediate processing is necessary + [] : + + // ...otherwise use results directly + results : + matcherIn; + + // Find primary matches + if ( matcher ) { + matcher( matcherIn, matcherOut, context, xml ); + } + + // Apply postFilter + if ( postFilter ) { + temp = condense( matcherOut, postMap ); + postFilter( temp, [], context, xml ); + + // Un-match failing elements by moving them back to matcherIn + i = temp.length; + while ( i-- ) { + if ( ( elem = temp[ i ] ) ) { + matcherOut[ postMap[ i ] ] = !( matcherIn[ postMap[ i ] ] = elem ); + } + } + } + + if ( seed ) { + if ( postFinder || preFilter ) { + if ( postFinder ) { + + // Get the final matcherOut by condensing this intermediate into postFinder contexts + temp = []; + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) ) { + + // Restore matcherIn since elem is not yet a final match + temp.push( ( matcherIn[ i ] = elem ) ); + } + } + postFinder( null, ( matcherOut = [] ), temp, xml ); + } + + // Move matched elements from seed to results to keep them synchronized + i = matcherOut.length; + while ( i-- ) { + if ( ( elem = matcherOut[ i ] ) && + ( temp = postFinder ? indexOf( seed, elem ) : preMap[ i ] ) > -1 ) { + + seed[ temp ] = !( results[ temp ] = elem ); + } + } + } + + // Add elements to results, through postFinder if defined + } else { + matcherOut = condense( + matcherOut === results ? + matcherOut.splice( preexisting, matcherOut.length ) : + matcherOut + ); + if ( postFinder ) { + postFinder( null, results, matcherOut, xml ); + } else { + push.apply( results, matcherOut ); + } + } + } ); +} + +function matcherFromTokens( tokens ) { + var checkContext, matcher, j, + len = tokens.length, + leadingRelative = Expr.relative[ tokens[ 0 ].type ], + implicitRelative = leadingRelative || Expr.relative[ " " ], + i = leadingRelative ? 
1 : 0, + + // The foundational matcher ensures that elements are reachable from top-level context(s) + matchContext = addCombinator( function( elem ) { + return elem === checkContext; + }, implicitRelative, true ), + matchAnyContext = addCombinator( function( elem ) { + return indexOf( checkContext, elem ) > -1; + }, implicitRelative, true ), + matchers = [ function( elem, context, xml ) { + var ret = ( !leadingRelative && ( xml || context !== outermostContext ) ) || ( + ( checkContext = context ).nodeType ? + matchContext( elem, context, xml ) : + matchAnyContext( elem, context, xml ) ); + + // Avoid hanging onto element (issue #299) + checkContext = null; + return ret; + } ]; + + for ( ; i < len; i++ ) { + if ( ( matcher = Expr.relative[ tokens[ i ].type ] ) ) { + matchers = [ addCombinator( elementMatcher( matchers ), matcher ) ]; + } else { + matcher = Expr.filter[ tokens[ i ].type ].apply( null, tokens[ i ].matches ); + + // Return special upon seeing a positional matcher + if ( matcher[ expando ] ) { + + // Find the next relative operator (if any) for proper handling + j = ++i; + for ( ; j < len; j++ ) { + if ( Expr.relative[ tokens[ j ].type ] ) { + break; + } + } + return setMatcher( + i > 1 && elementMatcher( matchers ), + i > 1 && toSelector( + + // If the preceding token was a descendant combinator, insert an implicit any-element `*` + tokens + .slice( 0, i - 1 ) + .concat( { value: tokens[ i - 2 ].type === " " ? 
"*" : "" } ) + ).replace( rtrim, "$1" ), + matcher, + i < j && matcherFromTokens( tokens.slice( i, j ) ), + j < len && matcherFromTokens( ( tokens = tokens.slice( j ) ) ), + j < len && toSelector( tokens ) + ); + } + matchers.push( matcher ); + } + } + + return elementMatcher( matchers ); +} + +function matcherFromGroupMatchers( elementMatchers, setMatchers ) { + var bySet = setMatchers.length > 0, + byElement = elementMatchers.length > 0, + superMatcher = function( seed, context, xml, results, outermost ) { + var elem, j, matcher, + matchedCount = 0, + i = "0", + unmatched = seed && [], + setMatched = [], + contextBackup = outermostContext, + + // We must always have either seed elements or outermost context + elems = seed || byElement && Expr.find[ "TAG" ]( "*", outermost ), + + // Use integer dirruns iff this is the outermost matcher + dirrunsUnique = ( dirruns += contextBackup == null ? 1 : Math.random() || 0.1 ), + len = elems.length; + + if ( outermost ) { + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. + // eslint-disable-next-line eqeqeq + outermostContext = context == document || context || outermost; + } + + // Add elements passing elementMatchers directly to results + // Support: IE<9, Safari + // Tolerate NodeList properties (IE: "length"; Safari: ) matching elements by id + for ( ; i !== len && ( elem = elems[ i ] ) != null; i++ ) { + if ( byElement && elem ) { + j = 0; + + // Support: IE 11+, Edge 17 - 18+ + // IE/Edge sometimes throw a "Permission denied" error when strict-comparing + // two documents; shallow comparisons work. 
+ // eslint-disable-next-line eqeqeq + if ( !context && elem.ownerDocument != document ) { + setDocument( elem ); + xml = !documentIsHTML; + } + while ( ( matcher = elementMatchers[ j++ ] ) ) { + if ( matcher( elem, context || document, xml ) ) { + results.push( elem ); + break; + } + } + if ( outermost ) { + dirruns = dirrunsUnique; + } + } + + // Track unmatched elements for set filters + if ( bySet ) { + + // They will have gone through all possible matchers + if ( ( elem = !matcher && elem ) ) { + matchedCount--; + } + + // Lengthen the array for every element, matched or not + if ( seed ) { + unmatched.push( elem ); + } + } + } + + // `i` is now the count of elements visited above, and adding it to `matchedCount` + // makes the latter nonnegative. + matchedCount += i; + + // Apply set filters to unmatched elements + // NOTE: This can be skipped if there are no unmatched elements (i.e., `matchedCount` + // equals `i`), unless we didn't visit _any_ elements in the above loop because we have + // no element matchers and no seed. + // Incrementing an initially-string "0" `i` allows `i` to remain a string only in that + // case, which will result in a "00" `matchedCount` that differs from `i` but is also + // numerically zero. 
+ if ( bySet && i !== matchedCount ) { + j = 0; + while ( ( matcher = setMatchers[ j++ ] ) ) { + matcher( unmatched, setMatched, context, xml ); + } + + if ( seed ) { + + // Reintegrate element matches to eliminate the need for sorting + if ( matchedCount > 0 ) { + while ( i-- ) { + if ( !( unmatched[ i ] || setMatched[ i ] ) ) { + setMatched[ i ] = pop.call( results ); + } + } + } + + // Discard index placeholder values to get only actual matches + setMatched = condense( setMatched ); + } + + // Add matches to results + push.apply( results, setMatched ); + + // Seedless set matches succeeding multiple successful matchers stipulate sorting + if ( outermost && !seed && setMatched.length > 0 && + ( matchedCount + setMatchers.length ) > 1 ) { + + Sizzle.uniqueSort( results ); + } + } + + // Override manipulation of globals by nested matchers + if ( outermost ) { + dirruns = dirrunsUnique; + outermostContext = contextBackup; + } + + return unmatched; + }; + + return bySet ? + markFunction( superMatcher ) : + superMatcher; +} + +compile = Sizzle.compile = function( selector, match /* Internal Use Only */ ) { + var i, + setMatchers = [], + elementMatchers = [], + cached = compilerCache[ selector + " " ]; + + if ( !cached ) { + + // Generate a function of recursive functions that can be used to check each element + if ( !match ) { + match = tokenize( selector ); + } + i = match.length; + while ( i-- ) { + cached = matcherFromTokens( match[ i ] ); + if ( cached[ expando ] ) { + setMatchers.push( cached ); + } else { + elementMatchers.push( cached ); + } + } + + // Cache the compiled function + cached = compilerCache( + selector, + matcherFromGroupMatchers( elementMatchers, setMatchers ) + ); + + // Save selector and tokenization + cached.selector = selector; + } + return cached; +}; + +/** + * A low-level selection function that works with Sizzle's compiled + * selector functions + * @param {String|Function} selector A selector or a pre-compiled + * selector function built 
with Sizzle.compile + * @param {Element} context + * @param {Array} [results] + * @param {Array} [seed] A set of elements to match against + */ +select = Sizzle.select = function( selector, context, results, seed ) { + var i, tokens, token, type, find, + compiled = typeof selector === "function" && selector, + match = !seed && tokenize( ( selector = compiled.selector || selector ) ); + + results = results || []; + + // Try to minimize operations if there is only one selector in the list and no seed + // (the latter of which guarantees us context) + if ( match.length === 1 ) { + + // Reduce context if the leading compound selector is an ID + tokens = match[ 0 ] = match[ 0 ].slice( 0 ); + if ( tokens.length > 2 && ( token = tokens[ 0 ] ).type === "ID" && + context.nodeType === 9 && documentIsHTML && Expr.relative[ tokens[ 1 ].type ] ) { + + context = ( Expr.find[ "ID" ]( token.matches[ 0 ] + .replace( runescape, funescape ), context ) || [] )[ 0 ]; + if ( !context ) { + return results; + + // Precompiled matchers will still verify ancestry, so step up a level + } else if ( compiled ) { + context = context.parentNode; + } + + selector = selector.slice( tokens.shift().value.length ); + } + + // Fetch a seed set for right-to-left matching + i = matchExpr[ "needsContext" ].test( selector ) ? 
0 : tokens.length; + while ( i-- ) { + token = tokens[ i ]; + + // Abort if we hit a combinator + if ( Expr.relative[ ( type = token.type ) ] ) { + break; + } + if ( ( find = Expr.find[ type ] ) ) { + + // Search, expanding context for leading sibling combinators + if ( ( seed = find( + token.matches[ 0 ].replace( runescape, funescape ), + rsibling.test( tokens[ 0 ].type ) && testContext( context.parentNode ) || + context + ) ) ) { + + // If seed is empty or no tokens remain, we can return early + tokens.splice( i, 1 ); + selector = seed.length && toSelector( tokens ); + if ( !selector ) { + push.apply( results, seed ); + return results; + } + + break; + } + } + } + } + + // Compile and execute a filtering function if one is not provided + // Provide `match` to avoid retokenization if we modified the selector above + ( compiled || compile( selector, match ) )( + seed, + context, + !documentIsHTML, + results, + !context || rsibling.test( selector ) && testContext( context.parentNode ) || context + ); + return results; +}; + +// One-time assignments + +// Sort stability +support.sortStable = expando.split( "" ).sort( sortOrder ).join( "" ) === expando; + +// Support: Chrome 14-35+ +// Always assume duplicates if they aren't passed to the comparison function +support.detectDuplicates = !!hasDuplicate; + +// Initialize against the default document +setDocument(); + +// Support: Webkit<537.32 - Safari 6.0.3/Chrome 25 (fixed in Chrome 27) +// Detached nodes confoundingly follow *each other* +support.sortDetached = assert( function( el ) { + + // Should return 1, but returns 4 (following) + return el.compareDocumentPosition( document.createElement( "fieldset" ) ) & 1; +} ); + +// Support: IE<8 +// Prevent attribute/property "interpolation" +// https://msdn.microsoft.com/en-us/library/ms536429%28VS.85%29.aspx +if ( !assert( function( el ) { + el.innerHTML = "<a href='#'></a>"; + return el.firstChild.getAttribute( "href" ) === "#"; +} ) ) { + addHandle( "type|href|height|width", function( 
elem, name, isXML ) { + if ( !isXML ) { + return elem.getAttribute( name, name.toLowerCase() === "type" ? 1 : 2 ); + } + } ); +} + +// Support: IE<9 +// Use defaultValue in place of getAttribute("value") +if ( !support.attributes || !assert( function( el ) { + el.innerHTML = "<input/>"; + el.firstChild.setAttribute( "value", "" ); + return el.firstChild.getAttribute( "value" ) === ""; +} ) ) { + addHandle( "value", function( elem, _name, isXML ) { + if ( !isXML && elem.nodeName.toLowerCase() === "input" ) { + return elem.defaultValue; + } + } ); +} + +// Support: IE<9 +// Use getAttributeNode to fetch booleans when getAttribute lies +if ( !assert( function( el ) { + return el.getAttribute( "disabled" ) == null; +} ) ) { + addHandle( booleans, function( elem, name, isXML ) { + var val; + if ( !isXML ) { + return elem[ name ] === true ? name.toLowerCase() : + ( val = elem.getAttributeNode( name ) ) && val.specified ? + val.value : + null; + } + } ); +} + +return Sizzle; + +} )( window ); + + + +jQuery.find = Sizzle; +jQuery.expr = Sizzle.selectors; + +// Deprecated +jQuery.expr[ ":" ] = jQuery.expr.pseudos; +jQuery.uniqueSort = jQuery.unique = Sizzle.uniqueSort; +jQuery.text = Sizzle.getText; +jQuery.isXMLDoc = Sizzle.isXML; +jQuery.contains = Sizzle.contains; +jQuery.escapeSelector = Sizzle.escape; + + + + +var dir = function( elem, dir, until ) { + var matched = [], + truncate = until !== undefined; + + while ( ( elem = elem[ dir ] ) && elem.nodeType !== 9 ) { + if ( elem.nodeType === 1 ) { + if ( truncate && jQuery( elem ).is( until ) ) { + break; + } + matched.push( elem ); + } + } + return matched; +}; + + +var siblings = function( n, elem ) { + var matched = []; + + for ( ; n; n = n.nextSibling ) { + if ( n.nodeType === 1 && n !== elem ) { + matched.push( n ); + } + } + + return matched; +}; + + +var rneedsContext = jQuery.expr.match.needsContext; + + + +function nodeName( elem, name ) { + + return elem.nodeName && elem.nodeName.toLowerCase() === name.toLowerCase(); + 
+}; +var rsingleTag = ( /^<([a-z][^\/\0>:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i ); + + + +// Implement the identical functionality for filter and not +function winnow( elements, qualifier, not ) { + if ( isFunction( qualifier ) ) { + return jQuery.grep( elements, function( elem, i ) { + return !!qualifier.call( elem, i, elem ) !== not; + } ); + } + + // Single element + if ( qualifier.nodeType ) { + return jQuery.grep( elements, function( elem ) { + return ( elem === qualifier ) !== not; + } ); + } + + // Arraylike of elements (jQuery, arguments, Array) + if ( typeof qualifier !== "string" ) { + return jQuery.grep( elements, function( elem ) { + return ( indexOf.call( qualifier, elem ) > -1 ) !== not; + } ); + } + + // Filtered directly for both simple and complex selectors + return jQuery.filter( qualifier, elements, not ); +} + +jQuery.filter = function( expr, elems, not ) { + var elem = elems[ 0 ]; + + if ( not ) { + expr = ":not(" + expr + ")"; + } + + if ( elems.length === 1 && elem.nodeType === 1 ) { + return jQuery.find.matchesSelector( elem, expr ) ? [ elem ] : []; + } + + return jQuery.find.matches( expr, jQuery.grep( elems, function( elem ) { + return elem.nodeType === 1; + } ) ); +}; + +jQuery.fn.extend( { + find: function( selector ) { + var i, ret, + len = this.length, + self = this; + + if ( typeof selector !== "string" ) { + return this.pushStack( jQuery( selector ).filter( function() { + for ( i = 0; i < len; i++ ) { + if ( jQuery.contains( self[ i ], this ) ) { + return true; + } + } + } ) ); + } + + ret = this.pushStack( [] ); + + for ( i = 0; i < len; i++ ) { + jQuery.find( selector, self[ i ], ret ); + } + + return len > 1 ? 
jQuery.uniqueSort( ret ) : ret; + }, + filter: function( selector ) { + return this.pushStack( winnow( this, selector || [], false ) ); + }, + not: function( selector ) { + return this.pushStack( winnow( this, selector || [], true ) ); + }, + is: function( selector ) { + return !!winnow( + this, + + // If this is a positional/relative selector, check membership in the returned set + // so $("p:first").is("p:last") won't return true for a doc with two "p". + typeof selector === "string" && rneedsContext.test( selector ) ? + jQuery( selector ) : + selector || [], + false + ).length; + } +} ); + + +// Initialize a jQuery object + + +// A central reference to the root jQuery(document) +var rootjQuery, + + // A simple way to check for HTML strings + // Prioritize #id over <tag> to avoid XSS via location.hash (#9521) + // Strict HTML recognition (#11290: must start with <) + // Shortcut simple #id case for speed + rquickExpr = /^(?:\s*(<[\w\W]+>)[^>]*|#([\w-]+))$/, + + init = jQuery.fn.init = function( selector, context, root ) { + var match, elem; + + // HANDLE: $(""), $(null), $(undefined), $(false) + if ( !selector ) { + return this; + } + + // Method init() accepts an alternate rootjQuery + // so migrate can support jQuery.sub (gh-2101) + root = root || rootjQuery; + + // Handle HTML strings + if ( typeof selector === "string" ) { + if ( selector[ 0 ] === "<" && + selector[ selector.length - 1 ] === ">" && + selector.length >= 3 ) { + + // Assume that strings that start and end with <> are HTML and skip the regex check + match = [ null, selector, null ]; + + } else { + match = rquickExpr.exec( selector ); + } + + // Match html or make sure no context is specified for #id + if ( match && ( match[ 1 ] || !context ) ) { + + // HANDLE: $(html) -> $(array) + if ( match[ 1 ] ) { + context = context instanceof jQuery ? 
context[ 0 ] : context; + + // Option to run scripts is true for back-compat + // Intentionally let the error be thrown if parseHTML is not present + jQuery.merge( this, jQuery.parseHTML( + match[ 1 ], + context && context.nodeType ? context.ownerDocument || context : document, + true + ) ); + + // HANDLE: $(html, props) + if ( rsingleTag.test( match[ 1 ] ) && jQuery.isPlainObject( context ) ) { + for ( match in context ) { + + // Properties of context are called as methods if possible + if ( isFunction( this[ match ] ) ) { + this[ match ]( context[ match ] ); + + // ...and otherwise set as attributes + } else { + this.attr( match, context[ match ] ); + } + } + } + + return this; + + // HANDLE: $(#id) + } else { + elem = document.getElementById( match[ 2 ] ); + + if ( elem ) { + + // Inject the element directly into the jQuery object + this[ 0 ] = elem; + this.length = 1; + } + return this; + } + + // HANDLE: $(expr, $(...)) + } else if ( !context || context.jquery ) { + return ( context || root ).find( selector ); + + // HANDLE: $(expr, context) + // (which is just equivalent to: $(context).find(expr) + } else { + return this.constructor( context ).find( selector ); + } + + // HANDLE: $(DOMElement) + } else if ( selector.nodeType ) { + this[ 0 ] = selector; + this.length = 1; + return this; + + // HANDLE: $(function) + // Shortcut for document ready + } else if ( isFunction( selector ) ) { + return root.ready !== undefined ? 
+ root.ready( selector ) : + + // Execute immediately if ready is not present + selector( jQuery ); + } + + return jQuery.makeArray( selector, this ); + }; + +// Give the init function the jQuery prototype for later instantiation +init.prototype = jQuery.fn; + +// Initialize central reference +rootjQuery = jQuery( document ); + + +var rparentsprev = /^(?:parents|prev(?:Until|All))/, + + // Methods guaranteed to produce a unique set when starting from a unique set + guaranteedUnique = { + children: true, + contents: true, + next: true, + prev: true + }; + +jQuery.fn.extend( { + has: function( target ) { + var targets = jQuery( target, this ), + l = targets.length; + + return this.filter( function() { + var i = 0; + for ( ; i < l; i++ ) { + if ( jQuery.contains( this, targets[ i ] ) ) { + return true; + } + } + } ); + }, + + closest: function( selectors, context ) { + var cur, + i = 0, + l = this.length, + matched = [], + targets = typeof selectors !== "string" && jQuery( selectors ); + + // Positional selectors never match, since there's no _selection_ context + if ( !rneedsContext.test( selectors ) ) { + for ( ; i < l; i++ ) { + for ( cur = this[ i ]; cur && cur !== context; cur = cur.parentNode ) { + + // Always skip document fragments + if ( cur.nodeType < 11 && ( targets ? + targets.index( cur ) > -1 : + + // Don't pass non-elements to Sizzle + cur.nodeType === 1 && + jQuery.find.matchesSelector( cur, selectors ) ) ) { + + matched.push( cur ); + break; + } + } + } + } + + return this.pushStack( matched.length > 1 ? jQuery.uniqueSort( matched ) : matched ); + }, + + // Determine the position of an element within the set + index: function( elem ) { + + // No argument, return index in parent + if ( !elem ) { + return ( this[ 0 ] && this[ 0 ].parentNode ) ? 
this.first().prevAll().length : -1; + } + + // Index in selector + if ( typeof elem === "string" ) { + return indexOf.call( jQuery( elem ), this[ 0 ] ); + } + + // Locate the position of the desired element + return indexOf.call( this, + + // If it receives a jQuery object, the first element is used + elem.jquery ? elem[ 0 ] : elem + ); + }, + + add: function( selector, context ) { + return this.pushStack( + jQuery.uniqueSort( + jQuery.merge( this.get(), jQuery( selector, context ) ) + ) + ); + }, + + addBack: function( selector ) { + return this.add( selector == null ? + this.prevObject : this.prevObject.filter( selector ) + ); + } +} ); + +function sibling( cur, dir ) { + while ( ( cur = cur[ dir ] ) && cur.nodeType !== 1 ) {} + return cur; +} + +jQuery.each( { + parent: function( elem ) { + var parent = elem.parentNode; + return parent && parent.nodeType !== 11 ? parent : null; + }, + parents: function( elem ) { + return dir( elem, "parentNode" ); + }, + parentsUntil: function( elem, _i, until ) { + return dir( elem, "parentNode", until ); + }, + next: function( elem ) { + return sibling( elem, "nextSibling" ); + }, + prev: function( elem ) { + return sibling( elem, "previousSibling" ); + }, + nextAll: function( elem ) { + return dir( elem, "nextSibling" ); + }, + prevAll: function( elem ) { + return dir( elem, "previousSibling" ); + }, + nextUntil: function( elem, _i, until ) { + return dir( elem, "nextSibling", until ); + }, + prevUntil: function( elem, _i, until ) { + return dir( elem, "previousSibling", until ); + }, + siblings: function( elem ) { + return siblings( ( elem.parentNode || {} ).firstChild, elem ); + }, + children: function( elem ) { + return siblings( elem.firstChild ); + }, + contents: function( elem ) { + if ( elem.contentDocument != null && + + // Support: IE 11+ + // <object> elements with no `data` attribute has an object + // `contentDocument` with a `null` prototype. 
+ getProto( elem.contentDocument ) ) { + + return elem.contentDocument; + } + + // Support: IE 9 - 11 only, iOS 7 only, Android Browser <=4.3 only + // Treat the template element as a regular one in browsers that + // don't support it. + if ( nodeName( elem, "template" ) ) { + elem = elem.content || elem; + } + + return jQuery.merge( [], elem.childNodes ); + } +}, function( name, fn ) { + jQuery.fn[ name ] = function( until, selector ) { + var matched = jQuery.map( this, fn, until ); + + if ( name.slice( -5 ) !== "Until" ) { + selector = until; + } + + if ( selector && typeof selector === "string" ) { + matched = jQuery.filter( selector, matched ); + } + + if ( this.length > 1 ) { + + // Remove duplicates + if ( !guaranteedUnique[ name ] ) { + jQuery.uniqueSort( matched ); + } + + // Reverse order for parents* and prev-derivatives + if ( rparentsprev.test( name ) ) { + matched.reverse(); + } + } + + return this.pushStack( matched ); + }; +} ); +var rnothtmlwhite = ( /[^\x20\t\r\n\f]+/g ); + + + +// Convert String-formatted options into Object-formatted ones +function createOptions( options ) { + var object = {}; + jQuery.each( options.match( rnothtmlwhite ) || [], function( _, flag ) { + object[ flag ] = true; + } ); + return object; +} + +/* + * Create a callback list using the following parameters: + * + * options: an optional list of space-separated options that will change how + * the callback list behaves or a more traditional option object + * + * By default a callback list will act like an event callback list and can be + * "fired" multiple times. 
+ * + * Possible options: + * + * once: will ensure the callback list can only be fired once (like a Deferred) + * + * memory: will keep track of previous values and will call any callback added + * after the list has been fired right away with the latest "memorized" + * values (like a Deferred) + * + * unique: will ensure a callback can only be added once (no duplicate in the list) + * + * stopOnFalse: interrupt callings when a callback returns false + * + */ +jQuery.Callbacks = function( options ) { + + // Convert options from String-formatted to Object-formatted if needed + // (we check in cache first) + options = typeof options === "string" ? + createOptions( options ) : + jQuery.extend( {}, options ); + + var // Flag to know if list is currently firing + firing, + + // Last fire value for non-forgettable lists + memory, + + // Flag to know if list was already fired + fired, + + // Flag to prevent firing + locked, + + // Actual callback list + list = [], + + // Queue of execution data for repeatable lists + queue = [], + + // Index of currently firing callback (modified by add/remove as needed) + firingIndex = -1, + + // Fire callbacks + fire = function() { + + // Enforce single-firing + locked = locked || options.once; + + // Execute callbacks for all pending executions, + // respecting firingIndex overrides and runtime changes + fired = firing = true; + for ( ; queue.length; firingIndex = -1 ) { + memory = queue.shift(); + while ( ++firingIndex < list.length ) { + + // Run callback and check for early termination + if ( list[ firingIndex ].apply( memory[ 0 ], memory[ 1 ] ) === false && + options.stopOnFalse ) { + + // Jump to end and forget the data so .add doesn't re-fire + firingIndex = list.length; + memory = false; + } + } + } + + // Forget the data if we're done with it + if ( !options.memory ) { + memory = false; + } + + firing = false; + + // Clean up if we're done firing for good + if ( locked ) { + + // Keep an empty list if we have data for future 
add calls + if ( memory ) { + list = []; + + // Otherwise, this object is spent + } else { + list = ""; + } + } + }, + + // Actual Callbacks object + self = { + + // Add a callback or a collection of callbacks to the list + add: function() { + if ( list ) { + + // If we have memory from a past run, we should fire after adding + if ( memory && !firing ) { + firingIndex = list.length - 1; + queue.push( memory ); + } + + ( function add( args ) { + jQuery.each( args, function( _, arg ) { + if ( isFunction( arg ) ) { + if ( !options.unique || !self.has( arg ) ) { + list.push( arg ); + } + } else if ( arg && arg.length && toType( arg ) !== "string" ) { + + // Inspect recursively + add( arg ); + } + } ); + } )( arguments ); + + if ( memory && !firing ) { + fire(); + } + } + return this; + }, + + // Remove a callback from the list + remove: function() { + jQuery.each( arguments, function( _, arg ) { + var index; + while ( ( index = jQuery.inArray( arg, list, index ) ) > -1 ) { + list.splice( index, 1 ); + + // Handle firing indexes + if ( index <= firingIndex ) { + firingIndex--; + } + } + } ); + return this; + }, + + // Check if a given callback is in the list. + // If no argument is given, return whether or not list has callbacks attached. + has: function( fn ) { + return fn ? 
+ jQuery.inArray( fn, list ) > -1 : + list.length > 0; + }, + + // Remove all callbacks from the list + empty: function() { + if ( list ) { + list = []; + } + return this; + }, + + // Disable .fire and .add + // Abort any current/pending executions + // Clear all callbacks and values + disable: function() { + locked = queue = []; + list = memory = ""; + return this; + }, + disabled: function() { + return !list; + }, + + // Disable .fire + // Also disable .add unless we have memory (since it would have no effect) + // Abort any pending executions + lock: function() { + locked = queue = []; + if ( !memory && !firing ) { + list = memory = ""; + } + return this; + }, + locked: function() { + return !!locked; + }, + + // Call all callbacks with the given context and arguments + fireWith: function( context, args ) { + if ( !locked ) { + args = args || []; + args = [ context, args.slice ? args.slice() : args ]; + queue.push( args ); + if ( !firing ) { + fire(); + } + } + return this; + }, + + // Call all the callbacks with the given arguments + fire: function() { + self.fireWith( this, arguments ); + return this; + }, + + // To know if the callbacks have already been called at least once + fired: function() { + return !!fired; + } + }; + + return self; +}; + + +function Identity( v ) { + return v; +} +function Thrower( ex ) { + throw ex; +} + +function adoptValue( value, resolve, reject, noValue ) { + var method; + + try { + + // Check for promise aspect first to privilege synchronous behavior + if ( value && isFunction( ( method = value.promise ) ) ) { + method.call( value ).done( resolve ).fail( reject ); + + // Other thenables + } else if ( value && isFunction( ( method = value.then ) ) ) { + method.call( value, resolve, reject ); + + // Other non-thenables + } else { + + // Control `resolve` arguments by letting Array#slice cast boolean `noValue` to integer: + // * false: [ value ].slice( 0 ) => resolve( value ) + // * true: [ value ].slice( 1 ) => resolve() + 
resolve.apply( undefined, [ value ].slice( noValue ) ); + } + + // For Promises/A+, convert exceptions into rejections + // Since jQuery.when doesn't unwrap thenables, we can skip the extra checks appearing in + // Deferred#then to conditionally suppress rejection. + } catch ( value ) { + + // Support: Android 4.0 only + // Strict mode functions invoked without .call/.apply get global-object context + reject.apply( undefined, [ value ] ); + } +} + +jQuery.extend( { + + Deferred: function( func ) { + var tuples = [ + + // action, add listener, callbacks, + // ... .then handlers, argument index, [final state] + [ "notify", "progress", jQuery.Callbacks( "memory" ), + jQuery.Callbacks( "memory" ), 2 ], + [ "resolve", "done", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 0, "resolved" ], + [ "reject", "fail", jQuery.Callbacks( "once memory" ), + jQuery.Callbacks( "once memory" ), 1, "rejected" ] + ], + state = "pending", + promise = { + state: function() { + return state; + }, + always: function() { + deferred.done( arguments ).fail( arguments ); + return this; + }, + "catch": function( fn ) { + return promise.then( null, fn ); + }, + + // Keep pipe for back-compat + pipe: function( /* fnDone, fnFail, fnProgress */ ) { + var fns = arguments; + + return jQuery.Deferred( function( newDefer ) { + jQuery.each( tuples, function( _i, tuple ) { + + // Map tuples (progress, done, fail) to arguments (done, fail, progress) + var fn = isFunction( fns[ tuple[ 4 ] ] ) && fns[ tuple[ 4 ] ]; + + // deferred.progress(function() { bind to newDefer or newDefer.notify }) + // deferred.done(function() { bind to newDefer or newDefer.resolve }) + // deferred.fail(function() { bind to newDefer or newDefer.reject }) + deferred[ tuple[ 1 ] ]( function() { + var returned = fn && fn.apply( this, arguments ); + if ( returned && isFunction( returned.promise ) ) { + returned.promise() + .progress( newDefer.notify ) + .done( newDefer.resolve ) + .fail( newDefer.reject ); + } 
else { + newDefer[ tuple[ 0 ] + "With" ]( + this, + fn ? [ returned ] : arguments + ); + } + } ); + } ); + fns = null; + } ).promise(); + }, + then: function( onFulfilled, onRejected, onProgress ) { + var maxDepth = 0; + function resolve( depth, deferred, handler, special ) { + return function() { + var that = this, + args = arguments, + mightThrow = function() { + var returned, then; + + // Support: Promises/A+ section 2.3.3.3.3 + // https://promisesaplus.com/#point-59 + // Ignore double-resolution attempts + if ( depth < maxDepth ) { + return; + } + + returned = handler.apply( that, args ); + + // Support: Promises/A+ section 2.3.1 + // https://promisesaplus.com/#point-48 + if ( returned === deferred.promise() ) { + throw new TypeError( "Thenable self-resolution" ); + } + + // Support: Promises/A+ sections 2.3.3.1, 3.5 + // https://promisesaplus.com/#point-54 + // https://promisesaplus.com/#point-75 + // Retrieve `then` only once + then = returned && + + // Support: Promises/A+ section 2.3.4 + // https://promisesaplus.com/#point-64 + // Only check objects and functions for thenability + ( typeof returned === "object" || + typeof returned === "function" ) && + returned.then; + + // Handle a returned thenable + if ( isFunction( then ) ) { + + // Special processors (notify) just wait for resolution + if ( special ) { + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ) + ); + + // Normal processors (resolve) also hook into progress + } else { + + // ...and disregard older resolution values + maxDepth++; + + then.call( + returned, + resolve( maxDepth, deferred, Identity, special ), + resolve( maxDepth, deferred, Thrower, special ), + resolve( maxDepth, deferred, Identity, + deferred.notifyWith ) + ); + } + + // Handle all other returned values + } else { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Identity ) { + that = 
undefined; + args = [ returned ]; + } + + // Process the value(s) + // Default process is resolve + ( special || deferred.resolveWith )( that, args ); + } + }, + + // Only normal processors (resolve) catch and reject exceptions + process = special ? + mightThrow : + function() { + try { + mightThrow(); + } catch ( e ) { + + if ( jQuery.Deferred.exceptionHook ) { + jQuery.Deferred.exceptionHook( e, + process.stackTrace ); + } + + // Support: Promises/A+ section 2.3.3.3.4.1 + // https://promisesaplus.com/#point-61 + // Ignore post-resolution exceptions + if ( depth + 1 >= maxDepth ) { + + // Only substitute handlers pass on context + // and multiple values (non-spec behavior) + if ( handler !== Thrower ) { + that = undefined; + args = [ e ]; + } + + deferred.rejectWith( that, args ); + } + } + }; + + // Support: Promises/A+ section 2.3.3.3.1 + // https://promisesaplus.com/#point-57 + // Re-resolve promises immediately to dodge false rejection from + // subsequent errors + if ( depth ) { + process(); + } else { + + // Call an optional hook to record the stack, in case of exception + // since it's otherwise lost when execution goes async + if ( jQuery.Deferred.getStackHook ) { + process.stackTrace = jQuery.Deferred.getStackHook(); + } + window.setTimeout( process ); + } + }; + } + + return jQuery.Deferred( function( newDefer ) { + + // progress_handlers.add( ... ) + tuples[ 0 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onProgress ) ? + onProgress : + Identity, + newDefer.notifyWith + ) + ); + + // fulfilled_handlers.add( ... ) + tuples[ 1 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onFulfilled ) ? + onFulfilled : + Identity + ) + ); + + // rejected_handlers.add( ... ) + tuples[ 2 ][ 3 ].add( + resolve( + 0, + newDefer, + isFunction( onRejected ) ? 
+ onRejected : + Thrower + ) + ); + } ).promise(); + }, + + // Get a promise for this deferred + // If obj is provided, the promise aspect is added to the object + promise: function( obj ) { + return obj != null ? jQuery.extend( obj, promise ) : promise; + } + }, + deferred = {}; + + // Add list-specific methods + jQuery.each( tuples, function( i, tuple ) { + var list = tuple[ 2 ], + stateString = tuple[ 5 ]; + + // promise.progress = list.add + // promise.done = list.add + // promise.fail = list.add + promise[ tuple[ 1 ] ] = list.add; + + // Handle state + if ( stateString ) { + list.add( + function() { + + // state = "resolved" (i.e., fulfilled) + // state = "rejected" + state = stateString; + }, + + // rejected_callbacks.disable + // fulfilled_callbacks.disable + tuples[ 3 - i ][ 2 ].disable, + + // rejected_handlers.disable + // fulfilled_handlers.disable + tuples[ 3 - i ][ 3 ].disable, + + // progress_callbacks.lock + tuples[ 0 ][ 2 ].lock, + + // progress_handlers.lock + tuples[ 0 ][ 3 ].lock + ); + } + + // progress_handlers.fire + // fulfilled_handlers.fire + // rejected_handlers.fire + list.add( tuple[ 3 ].fire ); + + // deferred.notify = function() { deferred.notifyWith(...) } + // deferred.resolve = function() { deferred.resolveWith(...) } + // deferred.reject = function() { deferred.rejectWith(...) } + deferred[ tuple[ 0 ] ] = function() { + deferred[ tuple[ 0 ] + "With" ]( this === deferred ? undefined : this, arguments ); + return this; + }; + + // deferred.notifyWith = list.fireWith + // deferred.resolveWith = list.fireWith + // deferred.rejectWith = list.fireWith + deferred[ tuple[ 0 ] + "With" ] = list.fireWith; + } ); + + // Make the deferred a promise + promise.promise( deferred ); + + // Call given func if any + if ( func ) { + func.call( deferred, deferred ); + } + + // All done! 
+ return deferred; + }, + + // Deferred helper + when: function( singleValue ) { + var + + // count of uncompleted subordinates + remaining = arguments.length, + + // count of unprocessed arguments + i = remaining, + + // subordinate fulfillment data + resolveContexts = Array( i ), + resolveValues = slice.call( arguments ), + + // the master Deferred + master = jQuery.Deferred(), + + // subordinate callback factory + updateFunc = function( i ) { + return function( value ) { + resolveContexts[ i ] = this; + resolveValues[ i ] = arguments.length > 1 ? slice.call( arguments ) : value; + if ( !( --remaining ) ) { + master.resolveWith( resolveContexts, resolveValues ); + } + }; + }; + + // Single- and empty arguments are adopted like Promise.resolve + if ( remaining <= 1 ) { + adoptValue( singleValue, master.done( updateFunc( i ) ).resolve, master.reject, + !remaining ); + + // Use .then() to unwrap secondary thenables (cf. gh-3000) + if ( master.state() === "pending" || + isFunction( resolveValues[ i ] && resolveValues[ i ].then ) ) { + + return master.then(); + } + } + + // Multiple arguments are aggregated like Promise.all array elements + while ( i-- ) { + adoptValue( resolveValues[ i ], updateFunc( i ), master.reject ); + } + + return master.promise(); + } +} ); + + +// These usually indicate a programmer mistake during development, +// warn about them ASAP rather than swallowing them by default. 
+var rerrorNames = /^(Eval|Internal|Range|Reference|Syntax|Type|URI)Error$/; + +jQuery.Deferred.exceptionHook = function( error, stack ) { + + // Support: IE 8 - 9 only + // Console exists when dev tools are open, which can happen at any time + if ( window.console && window.console.warn && error && rerrorNames.test( error.name ) ) { + window.console.warn( "jQuery.Deferred exception: " + error.message, error.stack, stack ); + } +}; + + + + +jQuery.readyException = function( error ) { + window.setTimeout( function() { + throw error; + } ); +}; + + + + +// The deferred used on DOM ready +var readyList = jQuery.Deferred(); + +jQuery.fn.ready = function( fn ) { + + readyList + .then( fn ) + + // Wrap jQuery.readyException in a function so that the lookup + // happens at the time of error handling instead of callback + // registration. + .catch( function( error ) { + jQuery.readyException( error ); + } ); + + return this; +}; + +jQuery.extend( { + + // Is the DOM ready to be used? Set to true once it occurs. + isReady: false, + + // A counter to track how many items to wait for before + // the ready event fires. See #6781 + readyWait: 1, + + // Handle when the DOM is ready + ready: function( wait ) { + + // Abort if there are pending holds or we're already ready + if ( wait === true ? 
--jQuery.readyWait : jQuery.isReady ) { + return; + } + + // Remember that the DOM is ready + jQuery.isReady = true; + + // If a normal DOM Ready event fired, decrement, and wait if need be + if ( wait !== true && --jQuery.readyWait > 0 ) { + return; + } + + // If there are functions bound, to execute + readyList.resolveWith( document, [ jQuery ] ); + } +} ); + +jQuery.ready.then = readyList.then; + +// The ready event handler and self cleanup method +function completed() { + document.removeEventListener( "DOMContentLoaded", completed ); + window.removeEventListener( "load", completed ); + jQuery.ready(); +} + +// Catch cases where $(document).ready() is called +// after the browser event has already occurred. +// Support: IE <=9 - 10 only +// Older IE sometimes signals "interactive" too soon +if ( document.readyState === "complete" || + ( document.readyState !== "loading" && !document.documentElement.doScroll ) ) { + + // Handle it asynchronously to allow scripts the opportunity to delay ready + window.setTimeout( jQuery.ready ); + +} else { + + // Use the handy event callback + document.addEventListener( "DOMContentLoaded", completed ); + + // A fallback to window.onload, that will always work + window.addEventListener( "load", completed ); +} + + + + +// Multifunctional method to get and set values of a collection +// The value/s can optionally be executed if it's a function +var access = function( elems, fn, key, value, chainable, emptyGet, raw ) { + var i = 0, + len = elems.length, + bulk = key == null; + + // Sets many values + if ( toType( key ) === "object" ) { + chainable = true; + for ( i in key ) { + access( elems, fn, i, key[ i ], true, emptyGet, raw ); + } + + // Sets one value + } else if ( value !== undefined ) { + chainable = true; + + if ( !isFunction( value ) ) { + raw = true; + } + + if ( bulk ) { + + // Bulk operations run against the entire set + if ( raw ) { + fn.call( elems, value ); + fn = null; + + // ...except when executing function 
values + } else { + bulk = fn; + fn = function( elem, _key, value ) { + return bulk.call( jQuery( elem ), value ); + }; + } + } + + if ( fn ) { + for ( ; i < len; i++ ) { + fn( + elems[ i ], key, raw ? + value : + value.call( elems[ i ], i, fn( elems[ i ], key ) ) + ); + } + } + } + + if ( chainable ) { + return elems; + } + + // Gets + if ( bulk ) { + return fn.call( elems ); + } + + return len ? fn( elems[ 0 ], key ) : emptyGet; +}; + + +// Matches dashed string for camelizing +var rmsPrefix = /^-ms-/, + rdashAlpha = /-([a-z])/g; + +// Used by camelCase as callback to replace() +function fcamelCase( _all, letter ) { + return letter.toUpperCase(); +} + +// Convert dashed to camelCase; used by the css and data modules +// Support: IE <=9 - 11, Edge 12 - 15 +// Microsoft forgot to hump their vendor prefix (#9572) +function camelCase( string ) { + return string.replace( rmsPrefix, "ms-" ).replace( rdashAlpha, fcamelCase ); +} +var acceptData = function( owner ) { + + // Accepts only: + // - Node + // - Node.ELEMENT_NODE + // - Node.DOCUMENT_NODE + // - Object + // - Any + return owner.nodeType === 1 || owner.nodeType === 9 || !( +owner.nodeType ); +}; + + + + +function Data() { + this.expando = jQuery.expando + Data.uid++; +} + +Data.uid = 1; + +Data.prototype = { + + cache: function( owner ) { + + // Check if the owner object already has a cache + var value = owner[ this.expando ]; + + // If not, create one + if ( !value ) { + value = {}; + + // We can accept data for non-element nodes in modern browsers, + // but we should not, see #8335. + // Always return an empty object. 
+ if ( acceptData( owner ) ) { + + // If it is a node unlikely to be stringify-ed or looped over + // use plain assignment + if ( owner.nodeType ) { + owner[ this.expando ] = value; + + // Otherwise secure it in a non-enumerable property + // configurable must be true to allow the property to be + // deleted when data is removed + } else { + Object.defineProperty( owner, this.expando, { + value: value, + configurable: true + } ); + } + } + } + + return value; + }, + set: function( owner, data, value ) { + var prop, + cache = this.cache( owner ); + + // Handle: [ owner, key, value ] args + // Always use camelCase key (gh-2257) + if ( typeof data === "string" ) { + cache[ camelCase( data ) ] = value; + + // Handle: [ owner, { properties } ] args + } else { + + // Copy the properties one-by-one to the cache object + for ( prop in data ) { + cache[ camelCase( prop ) ] = data[ prop ]; + } + } + return cache; + }, + get: function( owner, key ) { + return key === undefined ? + this.cache( owner ) : + + // Always use camelCase key (gh-2257) + owner[ this.expando ] && owner[ this.expando ][ camelCase( key ) ]; + }, + access: function( owner, key, value ) { + + // In cases where either: + // + // 1. No key was specified + // 2. A string key was specified, but no value provided + // + // Take the "read" path and allow the get method to determine + // which value to return, respectively either: + // + // 1. The entire cache object + // 2. The data stored at the key + // + if ( key === undefined || + ( ( key && typeof key === "string" ) && value === undefined ) ) { + + return this.get( owner, key ); + } + + // When the key is not a string, or both a key and value + // are specified, set or extend (existing objects) with either: + // + // 1. An object of properties + // 2. 
A key and value + // + this.set( owner, key, value ); + + // Since the "set" path can have two possible entry points + // return the expected data based on which path was taken[*] + return value !== undefined ? value : key; + }, + remove: function( owner, key ) { + var i, + cache = owner[ this.expando ]; + + if ( cache === undefined ) { + return; + } + + if ( key !== undefined ) { + + // Support array or space separated string of keys + if ( Array.isArray( key ) ) { + + // If key is an array of keys... + // We always set camelCase keys, so remove that. + key = key.map( camelCase ); + } else { + key = camelCase( key ); + + // If a key with the spaces exists, use it. + // Otherwise, create an array by matching non-whitespace + key = key in cache ? + [ key ] : + ( key.match( rnothtmlwhite ) || [] ); + } + + i = key.length; + + while ( i-- ) { + delete cache[ key[ i ] ]; + } + } + + // Remove the expando if there's no more data + if ( key === undefined || jQuery.isEmptyObject( cache ) ) { + + // Support: Chrome <=35 - 45 + // Webkit & Blink performance suffers when deleting properties + // from DOM nodes, so set to undefined instead + // https://bugs.chromium.org/p/chromium/issues/detail?id=378607 (bug restricted) + if ( owner.nodeType ) { + owner[ this.expando ] = undefined; + } else { + delete owner[ this.expando ]; + } + } + }, + hasData: function( owner ) { + var cache = owner[ this.expando ]; + return cache !== undefined && !jQuery.isEmptyObject( cache ); + } +}; +var dataPriv = new Data(); + +var dataUser = new Data(); + + + +// Implementation Summary +// +// 1. Enforce API surface and semantic compatibility with 1.9.x branch +// 2. Improve the module's maintainability by reducing the storage +// paths to a single mechanism. +// 3. Use the same single mechanism to support "private" and "user" data. +// 4. _Never_ expose "private" data to user code (TODO: Drop _data, _removeData) +// 5. Avoid exposing implementation details on user objects (eg. 
expando properties) +// 6. Provide a clear path for implementation upgrade to WeakMap in 2014 + +var rbrace = /^(?:\{[\w\W]*\}|\[[\w\W]*\])$/, + rmultiDash = /[A-Z]/g; + +function getData( data ) { + if ( data === "true" ) { + return true; + } + + if ( data === "false" ) { + return false; + } + + if ( data === "null" ) { + return null; + } + + // Only convert to a number if it doesn't change the string + if ( data === +data + "" ) { + return +data; + } + + if ( rbrace.test( data ) ) { + return JSON.parse( data ); + } + + return data; +} + +function dataAttr( elem, key, data ) { + var name; + + // If nothing was found internally, try to fetch any + // data from the HTML5 data-* attribute + if ( data === undefined && elem.nodeType === 1 ) { + name = "data-" + key.replace( rmultiDash, "-$&" ).toLowerCase(); + data = elem.getAttribute( name ); + + if ( typeof data === "string" ) { + try { + data = getData( data ); + } catch ( e ) {} + + // Make sure we set the data so it isn't changed later + dataUser.set( elem, key, data ); + } else { + data = undefined; + } + } + return data; +} + +jQuery.extend( { + hasData: function( elem ) { + return dataUser.hasData( elem ) || dataPriv.hasData( elem ); + }, + + data: function( elem, name, data ) { + return dataUser.access( elem, name, data ); + }, + + removeData: function( elem, name ) { + dataUser.remove( elem, name ); + }, + + // TODO: Now that all calls to _data and _removeData have been replaced + // with direct calls to dataPriv methods, these can be deprecated. 
+ _data: function( elem, name, data ) { + return dataPriv.access( elem, name, data ); + }, + + _removeData: function( elem, name ) { + dataPriv.remove( elem, name ); + } +} ); + +jQuery.fn.extend( { + data: function( key, value ) { + var i, name, data, + elem = this[ 0 ], + attrs = elem && elem.attributes; + + // Gets all values + if ( key === undefined ) { + if ( this.length ) { + data = dataUser.get( elem ); + + if ( elem.nodeType === 1 && !dataPriv.get( elem, "hasDataAttrs" ) ) { + i = attrs.length; + while ( i-- ) { + + // Support: IE 11 only + // The attrs elements can be null (#14894) + if ( attrs[ i ] ) { + name = attrs[ i ].name; + if ( name.indexOf( "data-" ) === 0 ) { + name = camelCase( name.slice( 5 ) ); + dataAttr( elem, name, data[ name ] ); + } + } + } + dataPriv.set( elem, "hasDataAttrs", true ); + } + } + + return data; + } + + // Sets multiple values + if ( typeof key === "object" ) { + return this.each( function() { + dataUser.set( this, key ); + } ); + } + + return access( this, function( value ) { + var data; + + // The calling jQuery object (element matches) is not empty + // (and therefore has an element appears at this[ 0 ]) and the + // `value` parameter was not undefined. An empty jQuery object + // will result in `undefined` for elem = this[ 0 ] which will + // throw an exception if an attempt to read a data cache is made. + if ( elem && value === undefined ) { + + // Attempt to get data from the cache + // The key will always be camelCased in Data + data = dataUser.get( elem, key ); + if ( data !== undefined ) { + return data; + } + + // Attempt to "discover" the data in + // HTML5 custom data-* attrs + data = dataAttr( elem, key ); + if ( data !== undefined ) { + return data; + } + + // We tried really hard, but the data doesn't exist. + return; + } + + // Set the data... 
+ this.each( function() { + + // We always store the camelCased key + dataUser.set( this, key, value ); + } ); + }, null, value, arguments.length > 1, null, true ); + }, + + removeData: function( key ) { + return this.each( function() { + dataUser.remove( this, key ); + } ); + } +} ); + + +jQuery.extend( { + queue: function( elem, type, data ) { + var queue; + + if ( elem ) { + type = ( type || "fx" ) + "queue"; + queue = dataPriv.get( elem, type ); + + // Speed up dequeue by getting out quickly if this is just a lookup + if ( data ) { + if ( !queue || Array.isArray( data ) ) { + queue = dataPriv.access( elem, type, jQuery.makeArray( data ) ); + } else { + queue.push( data ); + } + } + return queue || []; + } + }, + + dequeue: function( elem, type ) { + type = type || "fx"; + + var queue = jQuery.queue( elem, type ), + startLength = queue.length, + fn = queue.shift(), + hooks = jQuery._queueHooks( elem, type ), + next = function() { + jQuery.dequeue( elem, type ); + }; + + // If the fx queue is dequeued, always remove the progress sentinel + if ( fn === "inprogress" ) { + fn = queue.shift(); + startLength--; + } + + if ( fn ) { + + // Add a progress sentinel to prevent the fx queue from being + // automatically dequeued + if ( type === "fx" ) { + queue.unshift( "inprogress" ); + } + + // Clear up the last queue stop function + delete hooks.stop; + fn.call( elem, next, hooks ); + } + + if ( !startLength && hooks ) { + hooks.empty.fire(); + } + }, + + // Not public - generate a queueHooks object, or return the current one + _queueHooks: function( elem, type ) { + var key = type + "queueHooks"; + return dataPriv.get( elem, key ) || dataPriv.access( elem, key, { + empty: jQuery.Callbacks( "once memory" ).add( function() { + dataPriv.remove( elem, [ type + "queue", key ] ); + } ) + } ); + } +} ); + +jQuery.fn.extend( { + queue: function( type, data ) { + var setter = 2; + + if ( typeof type !== "string" ) { + data = type; + type = "fx"; + setter--; + } + + if ( 
arguments.length < setter ) { + return jQuery.queue( this[ 0 ], type ); + } + + return data === undefined ? + this : + this.each( function() { + var queue = jQuery.queue( this, type, data ); + + // Ensure a hooks for this queue + jQuery._queueHooks( this, type ); + + if ( type === "fx" && queue[ 0 ] !== "inprogress" ) { + jQuery.dequeue( this, type ); + } + } ); + }, + dequeue: function( type ) { + return this.each( function() { + jQuery.dequeue( this, type ); + } ); + }, + clearQueue: function( type ) { + return this.queue( type || "fx", [] ); + }, + + // Get a promise resolved when queues of a certain type + // are emptied (fx is the type by default) + promise: function( type, obj ) { + var tmp, + count = 1, + defer = jQuery.Deferred(), + elements = this, + i = this.length, + resolve = function() { + if ( !( --count ) ) { + defer.resolveWith( elements, [ elements ] ); + } + }; + + if ( typeof type !== "string" ) { + obj = type; + type = undefined; + } + type = type || "fx"; + + while ( i-- ) { + tmp = dataPriv.get( elements[ i ], type + "queueHooks" ); + if ( tmp && tmp.empty ) { + count++; + tmp.empty.add( resolve ); + } + } + resolve(); + return defer.promise( obj ); + } +} ); +var pnum = ( /[+-]?(?:\d*\.|)\d+(?:[eE][+-]?\d+|)/ ).source; + +var rcssNum = new RegExp( "^(?:([+-])=|)(" + pnum + ")([a-z%]*)$", "i" ); + + +var cssExpand = [ "Top", "Right", "Bottom", "Left" ]; + +var documentElement = document.documentElement; + + + + var isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ); + }, + composed = { composed: true }; + + // Support: IE 9 - 11+, Edge 12 - 18+, iOS 10.0 - 10.2 only + // Check attachment across shadow DOM boundaries when possible (gh-3504) + // Support: iOS 10.0-10.2 only + // Early iOS 10 versions support `attachShadow` but not `getRootNode`, + // leading to errors. We need to check for `getRootNode`. 
+ if ( documentElement.getRootNode ) { + isAttached = function( elem ) { + return jQuery.contains( elem.ownerDocument, elem ) || + elem.getRootNode( composed ) === elem.ownerDocument; + }; + } +var isHiddenWithinTree = function( elem, el ) { + + // isHiddenWithinTree might be called from jQuery#filter function; + // in that case, element will be second argument + elem = el || elem; + + // Inline style trumps all + return elem.style.display === "none" || + elem.style.display === "" && + + // Otherwise, check computed style + // Support: Firefox <=43 - 45 + // Disconnected elements can have computed display: none, so first confirm that elem is + // in the document. + isAttached( elem ) && + + jQuery.css( elem, "display" ) === "none"; + }; + + + +function adjustCSS( elem, prop, valueParts, tween ) { + var adjusted, scale, + maxIterations = 20, + currentValue = tween ? + function() { + return tween.cur(); + } : + function() { + return jQuery.css( elem, prop, "" ); + }, + initial = currentValue(), + unit = valueParts && valueParts[ 3 ] || ( jQuery.cssNumber[ prop ] ? "" : "px" ), + + // Starting value computation is required for potential unit mismatches + initialInUnit = elem.nodeType && + ( jQuery.cssNumber[ prop ] || unit !== "px" && +initial ) && + rcssNum.exec( jQuery.css( elem, prop ) ); + + if ( initialInUnit && initialInUnit[ 3 ] !== unit ) { + + // Support: Firefox <=54 + // Halve the iteration target value to prevent interference from CSS upper bounds (gh-2144) + initial = initial / 2; + + // Trust units reported by jQuery.css + unit = unit || initialInUnit[ 3 ]; + + // Iteratively approximate from a nonzero starting point + initialInUnit = +initial || 1; + + while ( maxIterations-- ) { + + // Evaluate and update our best guess (doubling guesses that zero out). + // Finish if the scale equals or crosses 1 (making the old*new product non-positive). 
+ jQuery.style( elem, prop, initialInUnit + unit ); + if ( ( 1 - scale ) * ( 1 - ( scale = currentValue() / initial || 0.5 ) ) <= 0 ) { + maxIterations = 0; + } + initialInUnit = initialInUnit / scale; + + } + + initialInUnit = initialInUnit * 2; + jQuery.style( elem, prop, initialInUnit + unit ); + + // Make sure we update the tween properties later on + valueParts = valueParts || []; + } + + if ( valueParts ) { + initialInUnit = +initialInUnit || +initial || 0; + + // Apply relative offset (+=/-=) if specified + adjusted = valueParts[ 1 ] ? + initialInUnit + ( valueParts[ 1 ] + 1 ) * valueParts[ 2 ] : + +valueParts[ 2 ]; + if ( tween ) { + tween.unit = unit; + tween.start = initialInUnit; + tween.end = adjusted; + } + } + return adjusted; +} + + +var defaultDisplayMap = {}; + +function getDefaultDisplay( elem ) { + var temp, + doc = elem.ownerDocument, + nodeName = elem.nodeName, + display = defaultDisplayMap[ nodeName ]; + + if ( display ) { + return display; + } + + temp = doc.body.appendChild( doc.createElement( nodeName ) ); + display = jQuery.css( temp, "display" ); + + temp.parentNode.removeChild( temp ); + + if ( display === "none" ) { + display = "block"; + } + defaultDisplayMap[ nodeName ] = display; + + return display; +} + +function showHide( elements, show ) { + var display, elem, + values = [], + index = 0, + length = elements.length; + + // Determine new display value for elements that need to change + for ( ; index < length; index++ ) { + elem = elements[ index ]; + if ( !elem.style ) { + continue; + } + + display = elem.style.display; + if ( show ) { + + // Since we force visibility upon cascade-hidden elements, an immediate (and slow) + // check is required in this first loop unless we have a nonempty display value (either + // inline or about-to-be-restored) + if ( display === "none" ) { + values[ index ] = dataPriv.get( elem, "display" ) || null; + if ( !values[ index ] ) { + elem.style.display = ""; + } + } + if ( elem.style.display === "" && 
isHiddenWithinTree( elem ) ) { + values[ index ] = getDefaultDisplay( elem ); + } + } else { + if ( display !== "none" ) { + values[ index ] = "none"; + + // Remember what we're overwriting + dataPriv.set( elem, "display", display ); + } + } + } + + // Set the display of the elements in a second loop to avoid constant reflow + for ( index = 0; index < length; index++ ) { + if ( values[ index ] != null ) { + elements[ index ].style.display = values[ index ]; + } + } + + return elements; +} + +jQuery.fn.extend( { + show: function() { + return showHide( this, true ); + }, + hide: function() { + return showHide( this ); + }, + toggle: function( state ) { + if ( typeof state === "boolean" ) { + return state ? this.show() : this.hide(); + } + + return this.each( function() { + if ( isHiddenWithinTree( this ) ) { + jQuery( this ).show(); + } else { + jQuery( this ).hide(); + } + } ); + } +} ); +var rcheckableType = ( /^(?:checkbox|radio)$/i ); + +var rtagName = ( /<([a-z][^\/\0>\x20\t\r\n\f]*)/i ); + +var rscriptType = ( /^$|^module$|\/(?:java|ecma)script/i ); + + + +( function() { + var fragment = document.createDocumentFragment(), + div = fragment.appendChild( document.createElement( "div" ) ), + input = document.createElement( "input" ); + + // Support: Android 4.0 - 4.3 only + // Check state lost if the name is set (#11217) + // Support: Windows Web Apps (WWA) + // `name` and `type` must use .setAttribute for WWA (#14901) + input.setAttribute( "type", "radio" ); + input.setAttribute( "checked", "checked" ); + input.setAttribute( "name", "t" ); + + div.appendChild( input ); + + // Support: Android <=4.1 only + // Older WebKit doesn't clone checked state correctly in fragments + support.checkClone = div.cloneNode( true ).cloneNode( true ).lastChild.checked; + + // Support: IE <=11 only + // Make sure textarea (and checkbox) defaultValue is properly cloned + div.innerHTML = "<textarea>x</textarea>"; + support.noCloneChecked = !!div.cloneNode( true ).lastChild.defaultValue; + + // Support: IE 
<=9 only + // IE <=9 replaces <option> tags with their contents when inserted outside of + // the select element. + div.innerHTML = "<option></option>"; + support.option = !!div.lastChild; +} )(); + + +// We have to close these tags to support XHTML (#13200) +var wrapMap = { + + // XHTML parsers do not magically insert elements in the + // same way that tag soup parsers do. So we cannot shorten + // this by omitting <tbody> or other required elements. + thead: [ 1, "<table>", "</table>" ], + col: [ 2, "<table><colgroup>", "</colgroup></table>" ], + tr: [ 2, "<table><tbody>", "</tbody></table>" ], + td: [ 3, "<table><tbody><tr>", "</tr></table>
    " ], + + _default: [ 0, "", "" ] +}; + +wrapMap.tbody = wrapMap.tfoot = wrapMap.colgroup = wrapMap.caption = wrapMap.thead; +wrapMap.th = wrapMap.td; + +// Support: IE <=9 only +if ( !support.option ) { + wrapMap.optgroup = wrapMap.option = [ 1, "" ]; +} + + +function getAll( context, tag ) { + + // Support: IE <=9 - 11 only + // Use typeof to avoid zero-argument method invocation on host objects (#15151) + var ret; + + if ( typeof context.getElementsByTagName !== "undefined" ) { + ret = context.getElementsByTagName( tag || "*" ); + + } else if ( typeof context.querySelectorAll !== "undefined" ) { + ret = context.querySelectorAll( tag || "*" ); + + } else { + ret = []; + } + + if ( tag === undefined || tag && nodeName( context, tag ) ) { + return jQuery.merge( [ context ], ret ); + } + + return ret; +} + + +// Mark scripts as having already been evaluated +function setGlobalEval( elems, refElements ) { + var i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + dataPriv.set( + elems[ i ], + "globalEval", + !refElements || dataPriv.get( refElements[ i ], "globalEval" ) + ); + } +} + + +var rhtml = /<|&#?\w+;/; + +function buildFragment( elems, context, scripts, selection, ignored ) { + var elem, tmp, tag, wrap, attached, j, + fragment = context.createDocumentFragment(), + nodes = [], + i = 0, + l = elems.length; + + for ( ; i < l; i++ ) { + elem = elems[ i ]; + + if ( elem || elem === 0 ) { + + // Add nodes directly + if ( toType( elem ) === "object" ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, elem.nodeType ? 
[ elem ] : elem ); + + // Convert non-html into a text node + } else if ( !rhtml.test( elem ) ) { + nodes.push( context.createTextNode( elem ) ); + + // Convert html into DOM nodes + } else { + tmp = tmp || fragment.appendChild( context.createElement( "div" ) ); + + // Deserialize a standard representation + tag = ( rtagName.exec( elem ) || [ "", "" ] )[ 1 ].toLowerCase(); + wrap = wrapMap[ tag ] || wrapMap._default; + tmp.innerHTML = wrap[ 1 ] + jQuery.htmlPrefilter( elem ) + wrap[ 2 ]; + + // Descend through wrappers to the right content + j = wrap[ 0 ]; + while ( j-- ) { + tmp = tmp.lastChild; + } + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( nodes, tmp.childNodes ); + + // Remember the top-level container + tmp = fragment.firstChild; + + // Ensure the created nodes are orphaned (#12392) + tmp.textContent = ""; + } + } + } + + // Remove wrapper from fragment + fragment.textContent = ""; + + i = 0; + while ( ( elem = nodes[ i++ ] ) ) { + + // Skip elements already in the context collection (trac-4087) + if ( selection && jQuery.inArray( elem, selection ) > -1 ) { + if ( ignored ) { + ignored.push( elem ); + } + continue; + } + + attached = isAttached( elem ); + + // Append to fragment + tmp = getAll( fragment.appendChild( elem ), "script" ); + + // Preserve script evaluation history + if ( attached ) { + setGlobalEval( tmp ); + } + + // Capture executables + if ( scripts ) { + j = 0; + while ( ( elem = tmp[ j++ ] ) ) { + if ( rscriptType.test( elem.type || "" ) ) { + scripts.push( elem ); + } + } + } + } + + return fragment; +} + + +var + rkeyEvent = /^key/, + rmouseEvent = /^(?:mouse|pointer|contextmenu|drag|drop)|click/, + rtypenamespace = /^([^.]*)(?:\.(.+)|)/; + +function returnTrue() { + return true; +} + +function returnFalse() { + return false; +} + +// Support: IE <=9 - 11+ +// focus() and blur() are asynchronous, except when they are no-op. 
+// So expect focus to be synchronous when the element is already active, +// and blur to be synchronous when the element is not already active. +// (focus and blur are always synchronous in other supported browsers, +// this just defines when we can count on it). +function expectSync( elem, type ) { + return ( elem === safeActiveElement() ) === ( type === "focus" ); +} + +// Support: IE <=9 only +// Accessing document.activeElement can throw unexpectedly +// https://bugs.jquery.com/ticket/13393 +function safeActiveElement() { + try { + return document.activeElement; + } catch ( err ) { } +} + +function on( elem, types, selector, data, fn, one ) { + var origFn, type; + + // Types can be a map of types/handlers + if ( typeof types === "object" ) { + + // ( types-Object, selector, data ) + if ( typeof selector !== "string" ) { + + // ( types-Object, data ) + data = data || selector; + selector = undefined; + } + for ( type in types ) { + on( elem, type, selector, data, types[ type ], one ); + } + return elem; + } + + if ( data == null && fn == null ) { + + // ( types, fn ) + fn = selector; + data = selector = undefined; + } else if ( fn == null ) { + if ( typeof selector === "string" ) { + + // ( types, selector, fn ) + fn = data; + data = undefined; + } else { + + // ( types, data, fn ) + fn = data; + data = selector; + selector = undefined; + } + } + if ( fn === false ) { + fn = returnFalse; + } else if ( !fn ) { + return elem; + } + + if ( one === 1 ) { + origFn = fn; + fn = function( event ) { + + // Can use an empty set, since event contains the info + jQuery().off( event ); + return origFn.apply( this, arguments ); + }; + + // Use same guid so caller can remove using origFn + fn.guid = origFn.guid || ( origFn.guid = jQuery.guid++ ); + } + return elem.each( function() { + jQuery.event.add( this, types, fn, data, selector ); + } ); +} + +/* + * Helper functions for managing events -- not part of the public interface. 
+ * Props to Dean Edwards' addEvent library for many of the ideas. + */ +jQuery.event = { + + global: {}, + + add: function( elem, types, handler, data, selector ) { + + var handleObjIn, eventHandle, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.get( elem ); + + // Only attach events to objects that accept data + if ( !acceptData( elem ) ) { + return; + } + + // Caller can pass in an object of custom data in lieu of the handler + if ( handler.handler ) { + handleObjIn = handler; + handler = handleObjIn.handler; + selector = handleObjIn.selector; + } + + // Ensure that invalid selectors throw exceptions at attach time + // Evaluate against documentElement in case elem is a non-element node (e.g., document) + if ( selector ) { + jQuery.find.matchesSelector( documentElement, selector ); + } + + // Make sure that the handler has a unique ID, used to find/remove it later + if ( !handler.guid ) { + handler.guid = jQuery.guid++; + } + + // Init the element's event structure and main handler, if this is the first + if ( !( events = elemData.events ) ) { + events = elemData.events = Object.create( null ); + } + if ( !( eventHandle = elemData.handle ) ) { + eventHandle = elemData.handle = function( e ) { + + // Discard the second event of a jQuery.event.trigger() and + // when an event is called after a page has unloaded + return typeof jQuery !== "undefined" && jQuery.event.triggered !== e.type ? + jQuery.event.dispatch.apply( elem, arguments ) : undefined; + }; + } + + // Handle multiple events separated by a space + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." 
).sort(); + + // There *must* be a type, no attaching namespace-only handlers + if ( !type ) { + continue; + } + + // If event changes its type, use the special event handlers for the changed type + special = jQuery.event.special[ type ] || {}; + + // If selector defined, determine special event api type, otherwise given type + type = ( selector ? special.delegateType : special.bindType ) || type; + + // Update special based on newly reset type + special = jQuery.event.special[ type ] || {}; + + // handleObj is passed to all event handlers + handleObj = jQuery.extend( { + type: type, + origType: origType, + data: data, + handler: handler, + guid: handler.guid, + selector: selector, + needsContext: selector && jQuery.expr.match.needsContext.test( selector ), + namespace: namespaces.join( "." ) + }, handleObjIn ); + + // Init the event handler queue if we're the first + if ( !( handlers = events[ type ] ) ) { + handlers = events[ type ] = []; + handlers.delegateCount = 0; + + // Only use addEventListener if the special events handler returns false + if ( !special.setup || + special.setup.call( elem, data, namespaces, eventHandle ) === false ) { + + if ( elem.addEventListener ) { + elem.addEventListener( type, eventHandle ); + } + } + } + + if ( special.add ) { + special.add.call( elem, handleObj ); + + if ( !handleObj.handler.guid ) { + handleObj.handler.guid = handler.guid; + } + } + + // Add to the element's handler list, delegates in front + if ( selector ) { + handlers.splice( handlers.delegateCount++, 0, handleObj ); + } else { + handlers.push( handleObj ); + } + + // Keep track of which events have ever been used, for event optimization + jQuery.event.global[ type ] = true; + } + + }, + + // Detach an event or set of events from an element + remove: function( elem, types, handler, selector, mappedTypes ) { + + var j, origCount, tmp, + events, t, handleObj, + special, handlers, type, namespaces, origType, + elemData = dataPriv.hasData( elem ) && dataPriv.get( 
elem ); + + if ( !elemData || !( events = elemData.events ) ) { + return; + } + + // Once for each type.namespace in types; type may be omitted + types = ( types || "" ).match( rnothtmlwhite ) || [ "" ]; + t = types.length; + while ( t-- ) { + tmp = rtypenamespace.exec( types[ t ] ) || []; + type = origType = tmp[ 1 ]; + namespaces = ( tmp[ 2 ] || "" ).split( "." ).sort(); + + // Unbind all events (on this namespace, if provided) for the element + if ( !type ) { + for ( type in events ) { + jQuery.event.remove( elem, type + types[ t ], handler, selector, true ); + } + continue; + } + + special = jQuery.event.special[ type ] || {}; + type = ( selector ? special.delegateType : special.bindType ) || type; + handlers = events[ type ] || []; + tmp = tmp[ 2 ] && + new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ); + + // Remove matching events + origCount = j = handlers.length; + while ( j-- ) { + handleObj = handlers[ j ]; + + if ( ( mappedTypes || origType === handleObj.origType ) && + ( !handler || handler.guid === handleObj.guid ) && + ( !tmp || tmp.test( handleObj.namespace ) ) && + ( !selector || selector === handleObj.selector || + selector === "**" && handleObj.selector ) ) { + handlers.splice( j, 1 ); + + if ( handleObj.selector ) { + handlers.delegateCount--; + } + if ( special.remove ) { + special.remove.call( elem, handleObj ); + } + } + } + + // Remove generic event handler if we removed something and no more handlers exist + // (avoids potential for endless recursion during removal of special event handlers) + if ( origCount && !handlers.length ) { + if ( !special.teardown || + special.teardown.call( elem, namespaces, elemData.handle ) === false ) { + + jQuery.removeEvent( elem, type, elemData.handle ); + } + + delete events[ type ]; + } + } + + // Remove data and the expando if it's no longer used + if ( jQuery.isEmptyObject( events ) ) { + dataPriv.remove( elem, "handle events" ); + } + }, + + dispatch: function( nativeEvent ) { + + 
var i, j, ret, matched, handleObj, handlerQueue, + args = new Array( arguments.length ), + + // Make a writable jQuery.Event from the native event object + event = jQuery.event.fix( nativeEvent ), + + handlers = ( + dataPriv.get( this, "events" ) || Object.create( null ) + )[ event.type ] || [], + special = jQuery.event.special[ event.type ] || {}; + + // Use the fix-ed jQuery.Event rather than the (read-only) native event + args[ 0 ] = event; + + for ( i = 1; i < arguments.length; i++ ) { + args[ i ] = arguments[ i ]; + } + + event.delegateTarget = this; + + // Call the preDispatch hook for the mapped type, and let it bail if desired + if ( special.preDispatch && special.preDispatch.call( this, event ) === false ) { + return; + } + + // Determine handlers + handlerQueue = jQuery.event.handlers.call( this, event, handlers ); + + // Run delegates first; they may want to stop propagation beneath us + i = 0; + while ( ( matched = handlerQueue[ i++ ] ) && !event.isPropagationStopped() ) { + event.currentTarget = matched.elem; + + j = 0; + while ( ( handleObj = matched.handlers[ j++ ] ) && + !event.isImmediatePropagationStopped() ) { + + // If the event is namespaced, then each handler is only invoked if it is + // specially universal or its namespaces are a superset of the event's. 
+ if ( !event.rnamespace || handleObj.namespace === false || + event.rnamespace.test( handleObj.namespace ) ) { + + event.handleObj = handleObj; + event.data = handleObj.data; + + ret = ( ( jQuery.event.special[ handleObj.origType ] || {} ).handle || + handleObj.handler ).apply( matched.elem, args ); + + if ( ret !== undefined ) { + if ( ( event.result = ret ) === false ) { + event.preventDefault(); + event.stopPropagation(); + } + } + } + } + } + + // Call the postDispatch hook for the mapped type + if ( special.postDispatch ) { + special.postDispatch.call( this, event ); + } + + return event.result; + }, + + handlers: function( event, handlers ) { + var i, handleObj, sel, matchedHandlers, matchedSelectors, + handlerQueue = [], + delegateCount = handlers.delegateCount, + cur = event.target; + + // Find delegate handlers + if ( delegateCount && + + // Support: IE <=9 + // Black-hole SVG instance trees (trac-13180) + cur.nodeType && + + // Support: Firefox <=42 + // Suppress spec-violating clicks indicating a non-primary pointer button (trac-3861) + // https://www.w3.org/TR/DOM-Level-3-Events/#event-type-click + // Support: IE 11 only + // ...but not arrow key "clicks" of radio inputs, which can have `button` -1 (gh-2343) + !( event.type === "click" && event.button >= 1 ) ) { + + for ( ; cur !== this; cur = cur.parentNode || this ) { + + // Don't check non-elements (#13208) + // Don't process clicks on disabled elements (#6911, #8165, #11382, #11764) + if ( cur.nodeType === 1 && !( event.type === "click" && cur.disabled === true ) ) { + matchedHandlers = []; + matchedSelectors = {}; + for ( i = 0; i < delegateCount; i++ ) { + handleObj = handlers[ i ]; + + // Don't conflict with Object.prototype properties (#13203) + sel = handleObj.selector + " "; + + if ( matchedSelectors[ sel ] === undefined ) { + matchedSelectors[ sel ] = handleObj.needsContext ? 
+ jQuery( sel, this ).index( cur ) > -1 : + jQuery.find( sel, this, null, [ cur ] ).length; + } + if ( matchedSelectors[ sel ] ) { + matchedHandlers.push( handleObj ); + } + } + if ( matchedHandlers.length ) { + handlerQueue.push( { elem: cur, handlers: matchedHandlers } ); + } + } + } + } + + // Add the remaining (directly-bound) handlers + cur = this; + if ( delegateCount < handlers.length ) { + handlerQueue.push( { elem: cur, handlers: handlers.slice( delegateCount ) } ); + } + + return handlerQueue; + }, + + addProp: function( name, hook ) { + Object.defineProperty( jQuery.Event.prototype, name, { + enumerable: true, + configurable: true, + + get: isFunction( hook ) ? + function() { + if ( this.originalEvent ) { + return hook( this.originalEvent ); + } + } : + function() { + if ( this.originalEvent ) { + return this.originalEvent[ name ]; + } + }, + + set: function( value ) { + Object.defineProperty( this, name, { + enumerable: true, + configurable: true, + writable: true, + value: value + } ); + } + } ); + }, + + fix: function( originalEvent ) { + return originalEvent[ jQuery.expando ] ? + originalEvent : + new jQuery.Event( originalEvent ); + }, + + special: { + load: { + + // Prevent triggered image.load events from bubbling to window.load + noBubble: true + }, + click: { + + // Utilize native event to ensure correct state for checkable inputs + setup: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. + var el = this || data; + + // Claim the first handler + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + // dataPriv.set( el, "click", ... 
) + leverageNative( el, "click", returnTrue ); + } + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function( data ) { + + // For mutual compressibility with _default, replace `this` access with a local var. + // `|| data` is dead code meant only to preserve the variable through minification. + var el = this || data; + + // Force setup before triggering a click + if ( rcheckableType.test( el.type ) && + el.click && nodeName( el, "input" ) ) { + + leverageNative( el, "click" ); + } + + // Return non-false to allow normal event-path propagation + return true; + }, + + // For cross-browser consistency, suppress native .click() on links + // Also prevent it if we're currently inside a leveraged native-event stack + _default: function( event ) { + var target = event.target; + return rcheckableType.test( target.type ) && + target.click && nodeName( target, "input" ) && + dataPriv.get( target, "click" ) || + nodeName( target, "a" ); + } + }, + + beforeunload: { + postDispatch: function( event ) { + + // Support: Firefox 20+ + // Firefox doesn't alert if the returnValue field is not set. + if ( event.result !== undefined && event.originalEvent ) { + event.originalEvent.returnValue = event.result; + } + } + } + } +}; + +// Ensure the presence of an event listener that handles manually-triggered +// synthetic events by interrupting progress until reinvoked in response to +// *native* events that it fires directly, ensuring that state changes have +// already occurred before other listeners are invoked. 
+function leverageNative( el, type, expectSync ) { + + // Missing expectSync indicates a trigger call, which must force setup through jQuery.event.add + if ( !expectSync ) { + if ( dataPriv.get( el, type ) === undefined ) { + jQuery.event.add( el, type, returnTrue ); + } + return; + } + + // Register the controller as a special universal handler for all event namespaces + dataPriv.set( el, type, false ); + jQuery.event.add( el, type, { + namespace: false, + handler: function( event ) { + var notAsync, result, + saved = dataPriv.get( this, type ); + + if ( ( event.isTrigger & 1 ) && this[ type ] ) { + + // Interrupt processing of the outer synthetic .trigger()ed event + // Saved data should be false in such cases, but might be a leftover capture object + // from an async native handler (gh-4350) + if ( !saved.length ) { + + // Store arguments for use when handling the inner native event + // There will always be at least one argument (an event object), so this array + // will not be confused with a leftover capture object. + saved = slice.call( arguments ); + dataPriv.set( this, type, saved ); + + // Trigger the native event and capture its result + // Support: IE <=9 - 11+ + // focus() and blur() are asynchronous + notAsync = expectSync( this, type ); + this[ type ](); + result = dataPriv.get( this, type ); + if ( saved !== result || notAsync ) { + dataPriv.set( this, type, false ); + } else { + result = {}; + } + if ( saved !== result ) { + + // Cancel the outer synthetic event + event.stopImmediatePropagation(); + event.preventDefault(); + return result.value; + } + + // If this is an inner synthetic event for an event with a bubbling surrogate + // (focus or blur), assume that the surrogate already propagated from triggering the + // native event and prevent that from happening again here. + // This technically gets the ordering wrong w.r.t. 
to `.trigger()` (in which the + // bubbling surrogate propagates *after* the non-bubbling base), but that seems + // less bad than duplication. + } else if ( ( jQuery.event.special[ type ] || {} ).delegateType ) { + event.stopPropagation(); + } + + // If this is a native event triggered above, everything is now in order + // Fire an inner synthetic event with the original arguments + } else if ( saved.length ) { + + // ...and capture the result + dataPriv.set( this, type, { + value: jQuery.event.trigger( + + // Support: IE <=9 - 11+ + // Extend with the prototype to reset the above stopImmediatePropagation() + jQuery.extend( saved[ 0 ], jQuery.Event.prototype ), + saved.slice( 1 ), + this + ) + } ); + + // Abort handling of the native event + event.stopImmediatePropagation(); + } + } + } ); +} + +jQuery.removeEvent = function( elem, type, handle ) { + + // This "if" is needed for plain objects + if ( elem.removeEventListener ) { + elem.removeEventListener( type, handle ); + } +}; + +jQuery.Event = function( src, props ) { + + // Allow instantiation without the 'new' keyword + if ( !( this instanceof jQuery.Event ) ) { + return new jQuery.Event( src, props ); + } + + // Event object + if ( src && src.type ) { + this.originalEvent = src; + this.type = src.type; + + // Events bubbling up the document may have been marked as prevented + // by a handler lower down the tree; reflect the correct value. + this.isDefaultPrevented = src.defaultPrevented || + src.defaultPrevented === undefined && + + // Support: Android <=2.3 only + src.returnValue === false ? + returnTrue : + returnFalse; + + // Create target properties + // Support: Safari <=6 - 7 only + // Target should not be a text node (#504, #13143) + this.target = ( src.target && src.target.nodeType === 3 ) ? 
+ src.target.parentNode : + src.target; + + this.currentTarget = src.currentTarget; + this.relatedTarget = src.relatedTarget; + + // Event type + } else { + this.type = src; + } + + // Put explicitly provided properties onto the event object + if ( props ) { + jQuery.extend( this, props ); + } + + // Create a timestamp if incoming event doesn't have one + this.timeStamp = src && src.timeStamp || Date.now(); + + // Mark it as fixed + this[ jQuery.expando ] = true; +}; + +// jQuery.Event is based on DOM3 Events as specified by the ECMAScript Language Binding +// https://www.w3.org/TR/2003/WD-DOM-Level-3-Events-20030331/ecma-script-binding.html +jQuery.Event.prototype = { + constructor: jQuery.Event, + isDefaultPrevented: returnFalse, + isPropagationStopped: returnFalse, + isImmediatePropagationStopped: returnFalse, + isSimulated: false, + + preventDefault: function() { + var e = this.originalEvent; + + this.isDefaultPrevented = returnTrue; + + if ( e && !this.isSimulated ) { + e.preventDefault(); + } + }, + stopPropagation: function() { + var e = this.originalEvent; + + this.isPropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopPropagation(); + } + }, + stopImmediatePropagation: function() { + var e = this.originalEvent; + + this.isImmediatePropagationStopped = returnTrue; + + if ( e && !this.isSimulated ) { + e.stopImmediatePropagation(); + } + + this.stopPropagation(); + } +}; + +// Includes all common event props including KeyEvent and MouseEvent specific props +jQuery.each( { + altKey: true, + bubbles: true, + cancelable: true, + changedTouches: true, + ctrlKey: true, + detail: true, + eventPhase: true, + metaKey: true, + pageX: true, + pageY: true, + shiftKey: true, + view: true, + "char": true, + code: true, + charCode: true, + key: true, + keyCode: true, + button: true, + buttons: true, + clientX: true, + clientY: true, + offsetX: true, + offsetY: true, + pointerId: true, + pointerType: true, + screenX: true, + screenY: true, + 
targetTouches: true, + toElement: true, + touches: true, + + which: function( event ) { + var button = event.button; + + // Add which for key events + if ( event.which == null && rkeyEvent.test( event.type ) ) { + return event.charCode != null ? event.charCode : event.keyCode; + } + + // Add which for click: 1 === left; 2 === middle; 3 === right + if ( !event.which && button !== undefined && rmouseEvent.test( event.type ) ) { + if ( button & 1 ) { + return 1; + } + + if ( button & 2 ) { + return 3; + } + + if ( button & 4 ) { + return 2; + } + + return 0; + } + + return event.which; + } +}, jQuery.event.addProp ); + +jQuery.each( { focus: "focusin", blur: "focusout" }, function( type, delegateType ) { + jQuery.event.special[ type ] = { + + // Utilize native event if possible so blur/focus sequence is correct + setup: function() { + + // Claim the first handler + // dataPriv.set( this, "focus", ... ) + // dataPriv.set( this, "blur", ... ) + leverageNative( this, type, expectSync ); + + // Return false to allow normal processing in the caller + return false; + }, + trigger: function() { + + // Force setup before trigger + leverageNative( this, type ); + + // Return non-false to allow normal event-path propagation + return true; + }, + + delegateType: delegateType + }; +} ); + +// Create mouseenter/leave events using mouseover/out and event-time checks +// so that event delegation works in jQuery. +// Do the same for pointerenter/pointerleave and pointerover/pointerout +// +// Support: Safari 7 only +// Safari sends mouseenter too often; see: +// https://bugs.chromium.org/p/chromium/issues/detail?id=470258 +// for the description of the bug (it existed in older Chrome versions as well). 
+jQuery.each( { + mouseenter: "mouseover", + mouseleave: "mouseout", + pointerenter: "pointerover", + pointerleave: "pointerout" +}, function( orig, fix ) { + jQuery.event.special[ orig ] = { + delegateType: fix, + bindType: fix, + + handle: function( event ) { + var ret, + target = this, + related = event.relatedTarget, + handleObj = event.handleObj; + + // For mouseenter/leave call the handler if related is outside the target. + // NB: No relatedTarget if the mouse left/entered the browser window + if ( !related || ( related !== target && !jQuery.contains( target, related ) ) ) { + event.type = handleObj.origType; + ret = handleObj.handler.apply( this, arguments ); + event.type = fix; + } + return ret; + } + }; +} ); + +jQuery.fn.extend( { + + on: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn ); + }, + one: function( types, selector, data, fn ) { + return on( this, types, selector, data, fn, 1 ); + }, + off: function( types, selector, fn ) { + var handleObj, type; + if ( types && types.preventDefault && types.handleObj ) { + + // ( event ) dispatched jQuery.Event + handleObj = types.handleObj; + jQuery( types.delegateTarget ).off( + handleObj.namespace ? + handleObj.origType + "." + handleObj.namespace : + handleObj.origType, + handleObj.selector, + handleObj.handler + ); + return this; + } + if ( typeof types === "object" ) { + + // ( types-object [, selector] ) + for ( type in types ) { + this.off( type, selector, types[ type ] ); + } + return this; + } + if ( selector === false || typeof selector === "function" ) { + + // ( types [, fn] ) + fn = selector; + selector = undefined; + } + if ( fn === false ) { + fn = returnFalse; + } + return this.each( function() { + jQuery.event.remove( this, types, fn, selector ); + } ); + } +} ); + + +var + + // Support: IE <=10 - 11, Edge 12 - 13 only + // In IE/Edge using regex groups here causes severe slowdowns. 
+ // See https://connect.microsoft.com/IE/feedback/details/1736512/ + rnoInnerhtml = /<script|<style|<link/i, + + // checked="checked" or checked + rchecked = /checked\s*(?:[^=]|=\s*.checked.)/i, + + rcleanScript = /^\s*<!(?:\[CDATA\[|--)|(?:\]\]|--)>\s*$/g; + +// Prefer a tbody over its parent table for containing new rows +function manipulationTarget( elem, content ) { + if ( nodeName( elem, "table" ) && + nodeName( content.nodeType !== 11 ? content : content.firstChild, "tr" ) ) { + + return jQuery( elem ).children( "tbody" )[ 0 ] || elem; + } + + return elem; +} + +// Replace/restore the type attribute of script elements for safe DOM manipulation +function disableScript( elem ) { + elem.type = ( elem.getAttribute( "type" ) !== null ) + "/" + elem.type; + return elem; +} +function restoreScript( elem ) { + if ( ( elem.type || "" ).slice( 0, 5 ) === "true/" ) { + elem.type = elem.type.slice( 5 ); + } else { + elem.removeAttribute( "type" ); + } + + return elem; +} + +function cloneCopyEvent( src, dest ) { + var i, l, type, pdataOld, udataOld, udataCur, events; + + if ( dest.nodeType !== 1 ) { + return; + } + + // 1. Copy private data: events, handlers, etc. + if ( dataPriv.hasData( src ) ) { + pdataOld = dataPriv.get( src ); + events = pdataOld.events; + + if ( events ) { + dataPriv.remove( dest, "handle events" ); + + for ( type in events ) { + for ( i = 0, l = events[ type ].length; i < l; i++ ) { + jQuery.event.add( dest, type, events[ type ][ i ] ); + } + } + } + } + + // 2. Copy user data + if ( dataUser.hasData( src ) ) { + udataOld = dataUser.access( src ); + udataCur = jQuery.extend( {}, udataOld ); + + dataUser.set( dest, udataCur ); + } +} + +// Fix IE bugs, see support tests +function fixInput( src, dest ) { + var nodeName = dest.nodeName.toLowerCase(); + + // Fails to persist the checked state of a cloned checkbox or radio button. 
+ if ( nodeName === "input" && rcheckableType.test( src.type ) ) { + dest.checked = src.checked; + + // Fails to return the selected option to the default selected state when cloning options + } else if ( nodeName === "input" || nodeName === "textarea" ) { + dest.defaultValue = src.defaultValue; + } +} + +function domManip( collection, args, callback, ignored ) { + + // Flatten any nested arrays + args = flat( args ); + + var fragment, first, scripts, hasScripts, node, doc, + i = 0, + l = collection.length, + iNoClone = l - 1, + value = args[ 0 ], + valueIsFunction = isFunction( value ); + + // We can't cloneNode fragments that contain checked, in WebKit + if ( valueIsFunction || + ( l > 1 && typeof value === "string" && + !support.checkClone && rchecked.test( value ) ) ) { + return collection.each( function( index ) { + var self = collection.eq( index ); + if ( valueIsFunction ) { + args[ 0 ] = value.call( this, index, self.html() ); + } + domManip( self, args, callback, ignored ); + } ); + } + + if ( l ) { + fragment = buildFragment( args, collection[ 0 ].ownerDocument, false, collection, ignored ); + first = fragment.firstChild; + + if ( fragment.childNodes.length === 1 ) { + fragment = first; + } + + // Require either new content or an interest in ignored elements to invoke the callback + if ( first || ignored ) { + scripts = jQuery.map( getAll( fragment, "script" ), disableScript ); + hasScripts = scripts.length; + + // Use the original fragment for the last item + // instead of the first because it can end up + // being emptied incorrectly in certain situations (#8070). 
+ for ( ; i < l; i++ ) { + node = fragment; + + if ( i !== iNoClone ) { + node = jQuery.clone( node, true, true ); + + // Keep references to cloned scripts for later restoration + if ( hasScripts ) { + + // Support: Android <=4.0 only, PhantomJS 1 only + // push.apply(_, arraylike) throws on ancient WebKit + jQuery.merge( scripts, getAll( node, "script" ) ); + } + } + + callback.call( collection[ i ], node, i ); + } + + if ( hasScripts ) { + doc = scripts[ scripts.length - 1 ].ownerDocument; + + // Reenable scripts + jQuery.map( scripts, restoreScript ); + + // Evaluate executable scripts on first document insertion + for ( i = 0; i < hasScripts; i++ ) { + node = scripts[ i ]; + if ( rscriptType.test( node.type || "" ) && + !dataPriv.access( node, "globalEval" ) && + jQuery.contains( doc, node ) ) { + + if ( node.src && ( node.type || "" ).toLowerCase() !== "module" ) { + + // Optional AJAX dependency, but won't run scripts if not present + if ( jQuery._evalUrl && !node.noModule ) { + jQuery._evalUrl( node.src, { + nonce: node.nonce || node.getAttribute( "nonce" ) + }, doc ); + } + } else { + DOMEval( node.textContent.replace( rcleanScript, "" ), node, doc ); + } + } + } + } + } + } + + return collection; +} + +function remove( elem, selector, keepData ) { + var node, + nodes = selector ? 
jQuery.filter( selector, elem ) : elem, + i = 0; + + for ( ; ( node = nodes[ i ] ) != null; i++ ) { + if ( !keepData && node.nodeType === 1 ) { + jQuery.cleanData( getAll( node ) ); + } + + if ( node.parentNode ) { + if ( keepData && isAttached( node ) ) { + setGlobalEval( getAll( node, "script" ) ); + } + node.parentNode.removeChild( node ); + } + } + + return elem; +} + +jQuery.extend( { + htmlPrefilter: function( html ) { + return html; + }, + + clone: function( elem, dataAndEvents, deepDataAndEvents ) { + var i, l, srcElements, destElements, + clone = elem.cloneNode( true ), + inPage = isAttached( elem ); + + // Fix IE cloning issues + if ( !support.noCloneChecked && ( elem.nodeType === 1 || elem.nodeType === 11 ) && + !jQuery.isXMLDoc( elem ) ) { + + // We eschew Sizzle here for performance reasons: https://jsperf.com/getall-vs-sizzle/2 + destElements = getAll( clone ); + srcElements = getAll( elem ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + fixInput( srcElements[ i ], destElements[ i ] ); + } + } + + // Copy the events from the original to the clone + if ( dataAndEvents ) { + if ( deepDataAndEvents ) { + srcElements = srcElements || getAll( elem ); + destElements = destElements || getAll( clone ); + + for ( i = 0, l = srcElements.length; i < l; i++ ) { + cloneCopyEvent( srcElements[ i ], destElements[ i ] ); + } + } else { + cloneCopyEvent( elem, clone ); + } + } + + // Preserve script evaluation history + destElements = getAll( clone, "script" ); + if ( destElements.length > 0 ) { + setGlobalEval( destElements, !inPage && getAll( elem, "script" ) ); + } + + // Return the cloned set + return clone; + }, + + cleanData: function( elems ) { + var data, elem, type, + special = jQuery.event.special, + i = 0; + + for ( ; ( elem = elems[ i ] ) !== undefined; i++ ) { + if ( acceptData( elem ) ) { + if ( ( data = elem[ dataPriv.expando ] ) ) { + if ( data.events ) { + for ( type in data.events ) { + if ( special[ type ] ) { + jQuery.event.remove( 
elem, type ); + + // This is a shortcut to avoid jQuery.event.remove's overhead + } else { + jQuery.removeEvent( elem, type, data.handle ); + } + } + } + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataPriv.expando ] = undefined; + } + if ( elem[ dataUser.expando ] ) { + + // Support: Chrome <=35 - 45+ + // Assign undefined instead of using delete, see Data#remove + elem[ dataUser.expando ] = undefined; + } + } + } + } +} ); + +jQuery.fn.extend( { + detach: function( selector ) { + return remove( this, selector, true ); + }, + + remove: function( selector ) { + return remove( this, selector ); + }, + + text: function( value ) { + return access( this, function( value ) { + return value === undefined ? + jQuery.text( this ) : + this.empty().each( function() { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + this.textContent = value; + } + } ); + }, null, value, arguments.length ); + }, + + append: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.appendChild( elem ); + } + } ); + }, + + prepend: function() { + return domManip( this, arguments, function( elem ) { + if ( this.nodeType === 1 || this.nodeType === 11 || this.nodeType === 9 ) { + var target = manipulationTarget( this, elem ); + target.insertBefore( elem, target.firstChild ); + } + } ); + }, + + before: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this ); + } + } ); + }, + + after: function() { + return domManip( this, arguments, function( elem ) { + if ( this.parentNode ) { + this.parentNode.insertBefore( elem, this.nextSibling ); + } + } ); + }, + + empty: function() { + var elem, + i = 0; + + for ( ; ( elem = this[ i ] ) != null; i++ ) { + if ( elem.nodeType === 1 ) { + + // 
Prevent memory leaks + jQuery.cleanData( getAll( elem, false ) ); + + // Remove any remaining nodes + elem.textContent = ""; + } + } + + return this; + }, + + clone: function( dataAndEvents, deepDataAndEvents ) { + dataAndEvents = dataAndEvents == null ? false : dataAndEvents; + deepDataAndEvents = deepDataAndEvents == null ? dataAndEvents : deepDataAndEvents; + + return this.map( function() { + return jQuery.clone( this, dataAndEvents, deepDataAndEvents ); + } ); + }, + + html: function( value ) { + return access( this, function( value ) { + var elem = this[ 0 ] || {}, + i = 0, + l = this.length; + + if ( value === undefined && elem.nodeType === 1 ) { + return elem.innerHTML; + } + + // See if we can take a shortcut and just use innerHTML + if ( typeof value === "string" && !rnoInnerhtml.test( value ) && + !wrapMap[ ( rtagName.exec( value ) || [ "", "" ] )[ 1 ].toLowerCase() ] ) { + + value = jQuery.htmlPrefilter( value ); + + try { + for ( ; i < l; i++ ) { + elem = this[ i ] || {}; + + // Remove element nodes and prevent memory leaks + if ( elem.nodeType === 1 ) { + jQuery.cleanData( getAll( elem, false ) ); + elem.innerHTML = value; + } + } + + elem = 0; + + // If using innerHTML throws an exception, use the fallback method + } catch ( e ) {} + } + + if ( elem ) { + this.empty().append( value ); + } + }, null, value, arguments.length ); + }, + + replaceWith: function() { + var ignored = []; + + // Make the changes, replacing each non-ignored context element with the new content + return domManip( this, arguments, function( elem ) { + var parent = this.parentNode; + + if ( jQuery.inArray( this, ignored ) < 0 ) { + jQuery.cleanData( getAll( this ) ); + if ( parent ) { + parent.replaceChild( elem, this ); + } + } + + // Force callback invocation + }, ignored ); + } +} ); + +jQuery.each( { + appendTo: "append", + prependTo: "prepend", + insertBefore: "before", + insertAfter: "after", + replaceAll: "replaceWith" +}, function( name, original ) { + jQuery.fn[ name ] = 
function( selector ) { + var elems, + ret = [], + insert = jQuery( selector ), + last = insert.length - 1, + i = 0; + + for ( ; i <= last; i++ ) { + elems = i === last ? this : this.clone( true ); + jQuery( insert[ i ] )[ original ]( elems ); + + // Support: Android <=4.0 only, PhantomJS 1 only + // .get() because push.apply(_, arraylike) throws on ancient WebKit + push.apply( ret, elems.get() ); + } + + return this.pushStack( ret ); + }; +} ); +var rnumnonpx = new RegExp( "^(" + pnum + ")(?!px)[a-z%]+$", "i" ); + +var getStyles = function( elem ) { + + // Support: IE <=11 only, Firefox <=30 (#15098, #14150) + // IE throws on elements created in popups + // FF meanwhile throws on frame elements through "defaultView.getComputedStyle" + var view = elem.ownerDocument.defaultView; + + if ( !view || !view.opener ) { + view = window; + } + + return view.getComputedStyle( elem ); + }; + +var swap = function( elem, options, callback ) { + var ret, name, + old = {}; + + // Remember the old values, and insert the new ones + for ( name in options ) { + old[ name ] = elem.style[ name ]; + elem.style[ name ] = options[ name ]; + } + + ret = callback.call( elem ); + + // Revert the old values + for ( name in options ) { + elem.style[ name ] = old[ name ]; + } + + return ret; +}; + + +var rboxStyle = new RegExp( cssExpand.join( "|" ), "i" ); + + + +( function() { + + // Executing both pixelPosition & boxSizingReliable tests require only one layout + // so they're executed at the same time to save the second computation. 
+ function computeStyleTests() { + + // This is a singleton, we need to execute it only once + if ( !div ) { + return; + } + + container.style.cssText = "position:absolute;left:-11111px;width:60px;" + + "margin-top:1px;padding:0;border:0"; + div.style.cssText = + "position:relative;display:block;box-sizing:border-box;overflow:scroll;" + + "margin:auto;border:1px;padding:1px;" + + "width:60%;top:1%"; + documentElement.appendChild( container ).appendChild( div ); + + var divStyle = window.getComputedStyle( div ); + pixelPositionVal = divStyle.top !== "1%"; + + // Support: Android 4.0 - 4.3 only, Firefox <=3 - 44 + reliableMarginLeftVal = roundPixelMeasures( divStyle.marginLeft ) === 12; + + // Support: Android 4.0 - 4.3 only, Safari <=9.1 - 10.1, iOS <=7.0 - 9.3 + // Some styles come back with percentage values, even though they shouldn't + div.style.right = "60%"; + pixelBoxStylesVal = roundPixelMeasures( divStyle.right ) === 36; + + // Support: IE 9 - 11 only + // Detect misreporting of content dimensions for box-sizing:border-box elements + boxSizingReliableVal = roundPixelMeasures( divStyle.width ) === 36; + + // Support: IE 9 only + // Detect overflow:scroll screwiness (gh-3699) + // Support: Chrome <=64 + // Don't get tricked when zoom affects offsetWidth (gh-4029) + div.style.position = "absolute"; + scrollboxSizeVal = roundPixelMeasures( div.offsetWidth / 3 ) === 12; + + documentElement.removeChild( container ); + + // Nullify the div so it wouldn't be stored in the memory and + // it will also be a sign that checks already performed + div = null; + } + + function roundPixelMeasures( measure ) { + return Math.round( parseFloat( measure ) ); + } + + var pixelPositionVal, boxSizingReliableVal, scrollboxSizeVal, pixelBoxStylesVal, + reliableTrDimensionsVal, reliableMarginLeftVal, + container = document.createElement( "div" ), + div = document.createElement( "div" ); + + // Finish early in limited (non-browser) environments + if ( !div.style ) { + return; + } + + 
// Support: IE <=9 - 11 only + // Style of cloned element affects source element cloned (#8908) + div.style.backgroundClip = "content-box"; + div.cloneNode( true ).style.backgroundClip = ""; + support.clearCloneStyle = div.style.backgroundClip === "content-box"; + + jQuery.extend( support, { + boxSizingReliable: function() { + computeStyleTests(); + return boxSizingReliableVal; + }, + pixelBoxStyles: function() { + computeStyleTests(); + return pixelBoxStylesVal; + }, + pixelPosition: function() { + computeStyleTests(); + return pixelPositionVal; + }, + reliableMarginLeft: function() { + computeStyleTests(); + return reliableMarginLeftVal; + }, + scrollboxSize: function() { + computeStyleTests(); + return scrollboxSizeVal; + }, + + // Support: IE 9 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Behavior in IE 9 is more subtle than in newer versions & it passes + // some versions of this test; make sure not to make it pass there! 
+ reliableTrDimensions: function() { + var table, tr, trChild, trStyle; + if ( reliableTrDimensionsVal == null ) { + table = document.createElement( "table" ); + tr = document.createElement( "tr" ); + trChild = document.createElement( "div" ); + + table.style.cssText = "position:absolute;left:-11111px"; + tr.style.height = "1px"; + trChild.style.height = "9px"; + + documentElement + .appendChild( table ) + .appendChild( tr ) + .appendChild( trChild ); + + trStyle = window.getComputedStyle( tr ); + reliableTrDimensionsVal = parseInt( trStyle.height ) > 3; + + documentElement.removeChild( table ); + } + return reliableTrDimensionsVal; + } + } ); +} )(); + + +function curCSS( elem, name, computed ) { + var width, minWidth, maxWidth, ret, + + // Support: Firefox 51+ + // Retrieving style before computed somehow + // fixes an issue with getting wrong values + // on detached elements + style = elem.style; + + computed = computed || getStyles( elem ); + + // getPropertyValue is needed for: + // .css('filter') (IE 9 only, #12537) + // .css('--customProperty) (#3144) + if ( computed ) { + ret = computed.getPropertyValue( name ) || computed[ name ]; + + if ( ret === "" && !isAttached( elem ) ) { + ret = jQuery.style( elem, name ); + } + + // A tribute to the "awesome hack by Dean Edwards" + // Android Browser returns percentage for some values, + // but width seems to be reliably pixels. + // This is against the CSSOM draft spec: + // https://drafts.csswg.org/cssom/#resolved-values + if ( !support.pixelBoxStyles() && rnumnonpx.test( ret ) && rboxStyle.test( name ) ) { + + // Remember the original values + width = style.width; + minWidth = style.minWidth; + maxWidth = style.maxWidth; + + // Put in the new values to get a computed value out + style.minWidth = style.maxWidth = style.width = ret; + ret = computed.width; + + // Revert the changed values + style.width = width; + style.minWidth = minWidth; + style.maxWidth = maxWidth; + } + } + + return ret !== undefined ? 
+ + // Support: IE <=9 - 11 only + // IE returns zIndex value as an integer. + ret + "" : + ret; +} + + +function addGetHookIf( conditionFn, hookFn ) { + + // Define the hook, we'll check on the first run if it's really needed. + return { + get: function() { + if ( conditionFn() ) { + + // Hook not needed (or it's not possible to use it due + // to missing dependency), remove it. + delete this.get; + return; + } + + // Hook needed; redefine it so that the support test is not executed again. + return ( this.get = hookFn ).apply( this, arguments ); + } + }; +} + + +var cssPrefixes = [ "Webkit", "Moz", "ms" ], + emptyStyle = document.createElement( "div" ).style, + vendorProps = {}; + +// Return a vendor-prefixed property or undefined +function vendorPropName( name ) { + + // Check for vendor prefixed names + var capName = name[ 0 ].toUpperCase() + name.slice( 1 ), + i = cssPrefixes.length; + + while ( i-- ) { + name = cssPrefixes[ i ] + capName; + if ( name in emptyStyle ) { + return name; + } + } +} + +// Return a potentially-mapped jQuery.cssProps or vendor prefixed property +function finalPropName( name ) { + var final = jQuery.cssProps[ name ] || vendorProps[ name ]; + + if ( final ) { + return final; + } + if ( name in emptyStyle ) { + return name; + } + return vendorProps[ name ] = vendorPropName( name ) || name; +} + + +var + + // Swappable if display is none or starts with table + // except "table", "table-cell", or "table-caption" + // See here for display values: https://developer.mozilla.org/en-US/docs/CSS/display + rdisplayswap = /^(none|table(?!-c[ea]).+)/, + rcustomProp = /^--/, + cssShow = { position: "absolute", visibility: "hidden", display: "block" }, + cssNormalTransform = { + letterSpacing: "0", + fontWeight: "400" + }; + +function setPositiveNumber( _elem, value, subtract ) { + + // Any relative (+/-) values have already been + // normalized at this point + var matches = rcssNum.exec( value ); + return matches ? 
+ + // Guard against undefined "subtract", e.g., when used as in cssHooks + Math.max( 0, matches[ 2 ] - ( subtract || 0 ) ) + ( matches[ 3 ] || "px" ) : + value; +} + +function boxModelAdjustment( elem, dimension, box, isBorderBox, styles, computedVal ) { + var i = dimension === "width" ? 1 : 0, + extra = 0, + delta = 0; + + // Adjustment may not be necessary + if ( box === ( isBorderBox ? "border" : "content" ) ) { + return 0; + } + + for ( ; i < 4; i += 2 ) { + + // Both box models exclude margin + if ( box === "margin" ) { + delta += jQuery.css( elem, box + cssExpand[ i ], true, styles ); + } + + // If we get here with a content-box, we're seeking "padding" or "border" or "margin" + if ( !isBorderBox ) { + + // Add padding + delta += jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + + // For "border" or "margin", add border + if ( box !== "padding" ) { + delta += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + + // But still keep track of it otherwise + } else { + extra += jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + + // If we get here with a border-box (content + padding + border), we're seeking "content" or + // "padding" or "margin" + } else { + + // For "content", subtract padding + if ( box === "content" ) { + delta -= jQuery.css( elem, "padding" + cssExpand[ i ], true, styles ); + } + + // For "content" or "padding", subtract border + if ( box !== "margin" ) { + delta -= jQuery.css( elem, "border" + cssExpand[ i ] + "Width", true, styles ); + } + } + } + + // Account for positive content-box scroll gutter when requested by providing computedVal + if ( !isBorderBox && computedVal >= 0 ) { + + // offsetWidth/offsetHeight is a rounded sum of content, padding, scroll gutter, and border + // Assuming integer scroll gutter, subtract the rest and round down + delta += Math.max( 0, Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + computedVal - + delta - + 
extra - + 0.5 + + // If offsetWidth/offsetHeight is unknown, then we can't determine content-box scroll gutter + // Use an explicit zero to avoid NaN (gh-3964) + ) ) || 0; + } + + return delta; +} + +function getWidthOrHeight( elem, dimension, extra ) { + + // Start with computed style + var styles = getStyles( elem ), + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-4322). + // Fake content-box until we know it's needed to know the true value. + boxSizingNeeded = !support.boxSizingReliable() || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + valueIsBorderBox = isBorderBox, + + val = curCSS( elem, dimension, styles ), + offsetProp = "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ); + + // Support: Firefox <=54 + // Return a confounding non-pixel value or feign ignorance, as appropriate. + if ( rnumnonpx.test( val ) ) { + if ( !extra ) { + return val; + } + val = "auto"; + } + + + // Support: IE 9 - 11 only + // Use offsetWidth/offsetHeight for when box sizing is unreliable. + // In those cases, the computed value can be trusted to be border-box. + if ( ( !support.boxSizingReliable() && isBorderBox || + + // Support: IE 10 - 11+, Edge 15 - 18+ + // IE/Edge misreport `getComputedStyle` of table rows with width/height + // set in CSS while `offset*` properties report correct values. + // Interestingly, in some cases IE 9 doesn't suffer from this issue. 
+ !support.reliableTrDimensions() && nodeName( elem, "tr" ) || + + // Fall back to offsetWidth/offsetHeight when value is "auto" + // This happens for inline elements with no explicit setting (gh-3571) + val === "auto" || + + // Support: Android <=4.1 - 4.3 only + // Also use offsetWidth/offsetHeight for misreported inline dimensions (gh-3602) + !parseFloat( val ) && jQuery.css( elem, "display", false, styles ) === "inline" ) && + + // Make sure the element is visible & connected + elem.getClientRects().length ) { + + isBorderBox = jQuery.css( elem, "boxSizing", false, styles ) === "border-box"; + + // Where available, offsetWidth/offsetHeight approximate border box dimensions. + // Where not available (e.g., SVG), assume unreliable box-sizing and interpret the + // retrieved value as a content box dimension. + valueIsBorderBox = offsetProp in elem; + if ( valueIsBorderBox ) { + val = elem[ offsetProp ]; + } + } + + // Normalize "" and auto + val = parseFloat( val ) || 0; + + // Adjust for the element's box model + return ( val + + boxModelAdjustment( + elem, + dimension, + extra || ( isBorderBox ? "border" : "content" ), + valueIsBorderBox, + styles, + + // Provide the current computed size to request scroll gutter calculation (gh-3589) + val + ) + ) + "px"; +} + +jQuery.extend( { + + // Add in style property hooks for overriding the default + // behavior of getting and setting a style property + cssHooks: { + opacity: { + get: function( elem, computed ) { + if ( computed ) { + + // We should always get a number back from opacity + var ret = curCSS( elem, "opacity" ); + return ret === "" ? 
"1" : ret; + } + } + } + }, + + // Don't automatically add "px" to these possibly-unitless properties + cssNumber: { + "animationIterationCount": true, + "columnCount": true, + "fillOpacity": true, + "flexGrow": true, + "flexShrink": true, + "fontWeight": true, + "gridArea": true, + "gridColumn": true, + "gridColumnEnd": true, + "gridColumnStart": true, + "gridRow": true, + "gridRowEnd": true, + "gridRowStart": true, + "lineHeight": true, + "opacity": true, + "order": true, + "orphans": true, + "widows": true, + "zIndex": true, + "zoom": true + }, + + // Add in properties whose names you wish to fix before + // setting or getting the value + cssProps: {}, + + // Get and set the style property on a DOM Node + style: function( elem, name, value, extra ) { + + // Don't set styles on text and comment nodes + if ( !elem || elem.nodeType === 3 || elem.nodeType === 8 || !elem.style ) { + return; + } + + // Make sure that we're working with the right name + var ret, type, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ), + style = elem.style; + + // Make sure that we're working with the right name. We don't + // want to query the value if it is a CSS custom property + // since they are user-defined. 
+ if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Gets hook for the prefixed version, then unprefixed version + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // Check if we're setting a value + if ( value !== undefined ) { + type = typeof value; + + // Convert "+=" or "-=" to relative numbers (#7345) + if ( type === "string" && ( ret = rcssNum.exec( value ) ) && ret[ 1 ] ) { + value = adjustCSS( elem, name, ret ); + + // Fixes bug #9237 + type = "number"; + } + + // Make sure that null and NaN values aren't set (#7116) + if ( value == null || value !== value ) { + return; + } + + // If a number was passed in, add the unit (except for certain CSS properties) + // The isCustomProp check can be removed in jQuery 4.0 when we only auto-append + // "px" to a few hardcoded values. + if ( type === "number" && !isCustomProp ) { + value += ret && ret[ 3 ] || ( jQuery.cssNumber[ origName ] ? "" : "px" ); + } + + // background-* props affect original clone's values + if ( !support.clearCloneStyle && value === "" && name.indexOf( "background" ) === 0 ) { + style[ name ] = "inherit"; + } + + // If a hook was provided, use that value, otherwise just set the specified value + if ( !hooks || !( "set" in hooks ) || + ( value = hooks.set( elem, value, extra ) ) !== undefined ) { + + if ( isCustomProp ) { + style.setProperty( name, value ); + } else { + style[ name ] = value; + } + } + + } else { + + // If a hook was provided get the non-computed value from there + if ( hooks && "get" in hooks && + ( ret = hooks.get( elem, false, extra ) ) !== undefined ) { + + return ret; + } + + // Otherwise just get the value from the style object + return style[ name ]; + } + }, + + css: function( elem, name, extra, styles ) { + var val, num, hooks, + origName = camelCase( name ), + isCustomProp = rcustomProp.test( name ); + + // Make sure that we're working with the right name. 
We don't + // want to modify the value if it is a CSS custom property + // since they are user-defined. + if ( !isCustomProp ) { + name = finalPropName( origName ); + } + + // Try prefixed name followed by the unprefixed name + hooks = jQuery.cssHooks[ name ] || jQuery.cssHooks[ origName ]; + + // If a hook was provided get the computed value from there + if ( hooks && "get" in hooks ) { + val = hooks.get( elem, true, extra ); + } + + // Otherwise, if a way to get the computed value exists, use that + if ( val === undefined ) { + val = curCSS( elem, name, styles ); + } + + // Convert "normal" to computed value + if ( val === "normal" && name in cssNormalTransform ) { + val = cssNormalTransform[ name ]; + } + + // Make numeric if forced or a qualifier was provided and val looks numeric + if ( extra === "" || extra ) { + num = parseFloat( val ); + return extra === true || isFinite( num ) ? num || 0 : val; + } + + return val; + } +} ); + +jQuery.each( [ "height", "width" ], function( _i, dimension ) { + jQuery.cssHooks[ dimension ] = { + get: function( elem, computed, extra ) { + if ( computed ) { + + // Certain elements can have dimension info if we invisibly show them + // but it must have a current display style that would benefit + return rdisplayswap.test( jQuery.css( elem, "display" ) ) && + + // Support: Safari 8+ + // Table columns in Safari have non-zero offsetWidth & zero + // getBoundingClientRect().width unless display is changed. + // Support: IE <=11 only + // Running getBoundingClientRect on a disconnected node + // in IE throws an error. + ( !elem.getClientRects().length || !elem.getBoundingClientRect().width ) ? + swap( elem, cssShow, function() { + return getWidthOrHeight( elem, dimension, extra ); + } ) : + getWidthOrHeight( elem, dimension, extra ); + } + }, + + set: function( elem, value, extra ) { + var matches, + styles = getStyles( elem ), + + // Only read styles.position if the test has a chance to fail + // to avoid forcing a reflow. 
+ scrollboxSizeBuggy = !support.scrollboxSize() && + styles.position === "absolute", + + // To avoid forcing a reflow, only fetch boxSizing if we need it (gh-3991) + boxSizingNeeded = scrollboxSizeBuggy || extra, + isBorderBox = boxSizingNeeded && + jQuery.css( elem, "boxSizing", false, styles ) === "border-box", + subtract = extra ? + boxModelAdjustment( + elem, + dimension, + extra, + isBorderBox, + styles + ) : + 0; + + // Account for unreliable border-box dimensions by comparing offset* to computed and + // faking a content-box to get border and padding (gh-3699) + if ( isBorderBox && scrollboxSizeBuggy ) { + subtract -= Math.ceil( + elem[ "offset" + dimension[ 0 ].toUpperCase() + dimension.slice( 1 ) ] - + parseFloat( styles[ dimension ] ) - + boxModelAdjustment( elem, dimension, "border", false, styles ) - + 0.5 + ); + } + + // Convert to pixels if value adjustment is needed + if ( subtract && ( matches = rcssNum.exec( value ) ) && + ( matches[ 3 ] || "px" ) !== "px" ) { + + elem.style[ dimension ] = value; + value = jQuery.css( elem, dimension ); + } + + return setPositiveNumber( elem, value, subtract ); + } + }; +} ); + +jQuery.cssHooks.marginLeft = addGetHookIf( support.reliableMarginLeft, + function( elem, computed ) { + if ( computed ) { + return ( parseFloat( curCSS( elem, "marginLeft" ) ) || + elem.getBoundingClientRect().left - + swap( elem, { marginLeft: 0 }, function() { + return elem.getBoundingClientRect().left; + } ) + ) + "px"; + } + } +); + +// These hooks are used by animate to expand properties +jQuery.each( { + margin: "", + padding: "", + border: "Width" +}, function( prefix, suffix ) { + jQuery.cssHooks[ prefix + suffix ] = { + expand: function( value ) { + var i = 0, + expanded = {}, + + // Assumes a single number if not a string + parts = typeof value === "string" ? 
value.split( " " ) : [ value ]; + + for ( ; i < 4; i++ ) { + expanded[ prefix + cssExpand[ i ] + suffix ] = + parts[ i ] || parts[ i - 2 ] || parts[ 0 ]; + } + + return expanded; + } + }; + + if ( prefix !== "margin" ) { + jQuery.cssHooks[ prefix + suffix ].set = setPositiveNumber; + } +} ); + +jQuery.fn.extend( { + css: function( name, value ) { + return access( this, function( elem, name, value ) { + var styles, len, + map = {}, + i = 0; + + if ( Array.isArray( name ) ) { + styles = getStyles( elem ); + len = name.length; + + for ( ; i < len; i++ ) { + map[ name[ i ] ] = jQuery.css( elem, name[ i ], false, styles ); + } + + return map; + } + + return value !== undefined ? + jQuery.style( elem, name, value ) : + jQuery.css( elem, name ); + }, name, value, arguments.length > 1 ); + } +} ); + + +function Tween( elem, options, prop, end, easing ) { + return new Tween.prototype.init( elem, options, prop, end, easing ); +} +jQuery.Tween = Tween; + +Tween.prototype = { + constructor: Tween, + init: function( elem, options, prop, end, easing, unit ) { + this.elem = elem; + this.prop = prop; + this.easing = easing || jQuery.easing._default; + this.options = options; + this.start = this.now = this.cur(); + this.end = end; + this.unit = unit || ( jQuery.cssNumber[ prop ] ? "" : "px" ); + }, + cur: function() { + var hooks = Tween.propHooks[ this.prop ]; + + return hooks && hooks.get ? 
+ hooks.get( this ) : + Tween.propHooks._default.get( this ); + }, + run: function( percent ) { + var eased, + hooks = Tween.propHooks[ this.prop ]; + + if ( this.options.duration ) { + this.pos = eased = jQuery.easing[ this.easing ]( + percent, this.options.duration * percent, 0, 1, this.options.duration + ); + } else { + this.pos = eased = percent; + } + this.now = ( this.end - this.start ) * eased + this.start; + + if ( this.options.step ) { + this.options.step.call( this.elem, this.now, this ); + } + + if ( hooks && hooks.set ) { + hooks.set( this ); + } else { + Tween.propHooks._default.set( this ); + } + return this; + } +}; + +Tween.prototype.init.prototype = Tween.prototype; + +Tween.propHooks = { + _default: { + get: function( tween ) { + var result; + + // Use a property on the element directly when it is not a DOM element, + // or when there is no matching style property that exists. + if ( tween.elem.nodeType !== 1 || + tween.elem[ tween.prop ] != null && tween.elem.style[ tween.prop ] == null ) { + return tween.elem[ tween.prop ]; + } + + // Passing an empty string as a 3rd parameter to .css will automatically + // attempt a parseFloat and fallback to a string if the parse fails. + // Simple values such as "10px" are parsed to Float; + // complex values such as "rotate(1rad)" are returned as-is. + result = jQuery.css( tween.elem, tween.prop, "" ); + + // Empty strings, null, undefined and "auto" are converted to 0. + return !result || result === "auto" ? 0 : result; + }, + set: function( tween ) { + + // Use step hook for back compat. + // Use cssHook if its there. + // Use .style if available and use plain properties where available. 
+ if ( jQuery.fx.step[ tween.prop ] ) { + jQuery.fx.step[ tween.prop ]( tween ); + } else if ( tween.elem.nodeType === 1 && ( + jQuery.cssHooks[ tween.prop ] || + tween.elem.style[ finalPropName( tween.prop ) ] != null ) ) { + jQuery.style( tween.elem, tween.prop, tween.now + tween.unit ); + } else { + tween.elem[ tween.prop ] = tween.now; + } + } + } +}; + +// Support: IE <=9 only +// Panic based approach to setting things on disconnected nodes +Tween.propHooks.scrollTop = Tween.propHooks.scrollLeft = { + set: function( tween ) { + if ( tween.elem.nodeType && tween.elem.parentNode ) { + tween.elem[ tween.prop ] = tween.now; + } + } +}; + +jQuery.easing = { + linear: function( p ) { + return p; + }, + swing: function( p ) { + return 0.5 - Math.cos( p * Math.PI ) / 2; + }, + _default: "swing" +}; + +jQuery.fx = Tween.prototype.init; + +// Back compat <1.8 extension point +jQuery.fx.step = {}; + + + + +var + fxNow, inProgress, + rfxtypes = /^(?:toggle|show|hide)$/, + rrun = /queueHooks$/; + +function schedule() { + if ( inProgress ) { + if ( document.hidden === false && window.requestAnimationFrame ) { + window.requestAnimationFrame( schedule ); + } else { + window.setTimeout( schedule, jQuery.fx.interval ); + } + + jQuery.fx.tick(); + } +} + +// Animations created synchronously will run synchronously +function createFxNow() { + window.setTimeout( function() { + fxNow = undefined; + } ); + return ( fxNow = Date.now() ); +} + +// Generate parameters to create a standard animation +function genFx( type, includeWidth ) { + var which, + i = 0, + attrs = { height: type }; + + // If we include width, step value is 1 to do all cssExpand values, + // otherwise step value is 2 to skip over Left and Right + includeWidth = includeWidth ? 
1 : 0; + for ( ; i < 4; i += 2 - includeWidth ) { + which = cssExpand[ i ]; + attrs[ "margin" + which ] = attrs[ "padding" + which ] = type; + } + + if ( includeWidth ) { + attrs.opacity = attrs.width = type; + } + + return attrs; +} + +function createTween( value, prop, animation ) { + var tween, + collection = ( Animation.tweeners[ prop ] || [] ).concat( Animation.tweeners[ "*" ] ), + index = 0, + length = collection.length; + for ( ; index < length; index++ ) { + if ( ( tween = collection[ index ].call( animation, prop, value ) ) ) { + + // We're done with this property + return tween; + } + } +} + +function defaultPrefilter( elem, props, opts ) { + var prop, value, toggle, hooks, oldfire, propTween, restoreDisplay, display, + isBox = "width" in props || "height" in props, + anim = this, + orig = {}, + style = elem.style, + hidden = elem.nodeType && isHiddenWithinTree( elem ), + dataShow = dataPriv.get( elem, "fxshow" ); + + // Queue-skipping animations hijack the fx hooks + if ( !opts.queue ) { + hooks = jQuery._queueHooks( elem, "fx" ); + if ( hooks.unqueued == null ) { + hooks.unqueued = 0; + oldfire = hooks.empty.fire; + hooks.empty.fire = function() { + if ( !hooks.unqueued ) { + oldfire(); + } + }; + } + hooks.unqueued++; + + anim.always( function() { + + // Ensure the complete handler is called before this completes + anim.always( function() { + hooks.unqueued--; + if ( !jQuery.queue( elem, "fx" ).length ) { + hooks.empty.fire(); + } + } ); + } ); + } + + // Detect show/hide animations + for ( prop in props ) { + value = props[ prop ]; + if ( rfxtypes.test( value ) ) { + delete props[ prop ]; + toggle = toggle || value === "toggle"; + if ( value === ( hidden ? 
"hide" : "show" ) ) { + + // Pretend to be hidden if this is a "show" and + // there is still data from a stopped show/hide + if ( value === "show" && dataShow && dataShow[ prop ] !== undefined ) { + hidden = true; + + // Ignore all other no-op show/hide data + } else { + continue; + } + } + orig[ prop ] = dataShow && dataShow[ prop ] || jQuery.style( elem, prop ); + } + } + + // Bail out if this is a no-op like .hide().hide() + propTween = !jQuery.isEmptyObject( props ); + if ( !propTween && jQuery.isEmptyObject( orig ) ) { + return; + } + + // Restrict "overflow" and "display" styles during box animations + if ( isBox && elem.nodeType === 1 ) { + + // Support: IE <=9 - 11, Edge 12 - 15 + // Record all 3 overflow attributes because IE does not infer the shorthand + // from identically-valued overflowX and overflowY and Edge just mirrors + // the overflowX value there. + opts.overflow = [ style.overflow, style.overflowX, style.overflowY ]; + + // Identify a display type, preferring old show/hide data over the CSS cascade + restoreDisplay = dataShow && dataShow.display; + if ( restoreDisplay == null ) { + restoreDisplay = dataPriv.get( elem, "display" ); + } + display = jQuery.css( elem, "display" ); + if ( display === "none" ) { + if ( restoreDisplay ) { + display = restoreDisplay; + } else { + + // Get nonempty value(s) by temporarily forcing visibility + showHide( [ elem ], true ); + restoreDisplay = elem.style.display || restoreDisplay; + display = jQuery.css( elem, "display" ); + showHide( [ elem ] ); + } + } + + // Animate inline elements as inline-block + if ( display === "inline" || display === "inline-block" && restoreDisplay != null ) { + if ( jQuery.css( elem, "float" ) === "none" ) { + + // Restore the original display value at the end of pure show/hide animations + if ( !propTween ) { + anim.done( function() { + style.display = restoreDisplay; + } ); + if ( restoreDisplay == null ) { + display = style.display; + restoreDisplay = display === "none" ? 
"" : display; + } + } + style.display = "inline-block"; + } + } + } + + if ( opts.overflow ) { + style.overflow = "hidden"; + anim.always( function() { + style.overflow = opts.overflow[ 0 ]; + style.overflowX = opts.overflow[ 1 ]; + style.overflowY = opts.overflow[ 2 ]; + } ); + } + + // Implement show/hide animations + propTween = false; + for ( prop in orig ) { + + // General show/hide setup for this element animation + if ( !propTween ) { + if ( dataShow ) { + if ( "hidden" in dataShow ) { + hidden = dataShow.hidden; + } + } else { + dataShow = dataPriv.access( elem, "fxshow", { display: restoreDisplay } ); + } + + // Store hidden/visible for toggle so `.stop().toggle()` "reverses" + if ( toggle ) { + dataShow.hidden = !hidden; + } + + // Show elements before animating them + if ( hidden ) { + showHide( [ elem ], true ); + } + + /* eslint-disable no-loop-func */ + + anim.done( function() { + + /* eslint-enable no-loop-func */ + + // The final step of a "hide" animation is actually hiding the element + if ( !hidden ) { + showHide( [ elem ] ); + } + dataPriv.remove( elem, "fxshow" ); + for ( prop in orig ) { + jQuery.style( elem, prop, orig[ prop ] ); + } + } ); + } + + // Per-property setup + propTween = createTween( hidden ? 
dataShow[ prop ] : 0, prop, anim ); + if ( !( prop in dataShow ) ) { + dataShow[ prop ] = propTween.start; + if ( hidden ) { + propTween.end = propTween.start; + propTween.start = 0; + } + } + } +} + +function propFilter( props, specialEasing ) { + var index, name, easing, value, hooks; + + // camelCase, specialEasing and expand cssHook pass + for ( index in props ) { + name = camelCase( index ); + easing = specialEasing[ name ]; + value = props[ index ]; + if ( Array.isArray( value ) ) { + easing = value[ 1 ]; + value = props[ index ] = value[ 0 ]; + } + + if ( index !== name ) { + props[ name ] = value; + delete props[ index ]; + } + + hooks = jQuery.cssHooks[ name ]; + if ( hooks && "expand" in hooks ) { + value = hooks.expand( value ); + delete props[ name ]; + + // Not quite $.extend, this won't overwrite existing keys. + // Reusing 'index' because we have the correct "name" + for ( index in value ) { + if ( !( index in props ) ) { + props[ index ] = value[ index ]; + specialEasing[ index ] = easing; + } + } + } else { + specialEasing[ name ] = easing; + } + } +} + +function Animation( elem, properties, options ) { + var result, + stopped, + index = 0, + length = Animation.prefilters.length, + deferred = jQuery.Deferred().always( function() { + + // Don't match elem in the :animated selector + delete tick.elem; + } ), + tick = function() { + if ( stopped ) { + return false; + } + var currentTime = fxNow || createFxNow(), + remaining = Math.max( 0, animation.startTime + animation.duration - currentTime ), + + // Support: Android 2.3 only + // Archaic crash bug won't allow us to use `1 - ( 0.5 || 0 )` (#12497) + temp = remaining / animation.duration || 0, + percent = 1 - temp, + index = 0, + length = animation.tweens.length; + + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( percent ); + } + + deferred.notifyWith( elem, [ animation, percent, remaining ] ); + + // If there's more to do, yield + if ( percent < 1 && length ) { + return 
remaining; + } + + // If this was an empty animation, synthesize a final progress notification + if ( !length ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + } + + // Resolve the animation and report its conclusion + deferred.resolveWith( elem, [ animation ] ); + return false; + }, + animation = deferred.promise( { + elem: elem, + props: jQuery.extend( {}, properties ), + opts: jQuery.extend( true, { + specialEasing: {}, + easing: jQuery.easing._default + }, options ), + originalProperties: properties, + originalOptions: options, + startTime: fxNow || createFxNow(), + duration: options.duration, + tweens: [], + createTween: function( prop, end ) { + var tween = jQuery.Tween( elem, animation.opts, prop, end, + animation.opts.specialEasing[ prop ] || animation.opts.easing ); + animation.tweens.push( tween ); + return tween; + }, + stop: function( gotoEnd ) { + var index = 0, + + // If we are going to the end, we want to run all the tweens + // otherwise we skip this part + length = gotoEnd ? 
animation.tweens.length : 0; + if ( stopped ) { + return this; + } + stopped = true; + for ( ; index < length; index++ ) { + animation.tweens[ index ].run( 1 ); + } + + // Resolve when we played the last frame; otherwise, reject + if ( gotoEnd ) { + deferred.notifyWith( elem, [ animation, 1, 0 ] ); + deferred.resolveWith( elem, [ animation, gotoEnd ] ); + } else { + deferred.rejectWith( elem, [ animation, gotoEnd ] ); + } + return this; + } + } ), + props = animation.props; + + propFilter( props, animation.opts.specialEasing ); + + for ( ; index < length; index++ ) { + result = Animation.prefilters[ index ].call( animation, elem, props, animation.opts ); + if ( result ) { + if ( isFunction( result.stop ) ) { + jQuery._queueHooks( animation.elem, animation.opts.queue ).stop = + result.stop.bind( result ); + } + return result; + } + } + + jQuery.map( props, createTween, animation ); + + if ( isFunction( animation.opts.start ) ) { + animation.opts.start.call( elem, animation ); + } + + // Attach callbacks from options + animation + .progress( animation.opts.progress ) + .done( animation.opts.done, animation.opts.complete ) + .fail( animation.opts.fail ) + .always( animation.opts.always ); + + jQuery.fx.timer( + jQuery.extend( tick, { + elem: elem, + anim: animation, + queue: animation.opts.queue + } ) + ); + + return animation; +} + +jQuery.Animation = jQuery.extend( Animation, { + + tweeners: { + "*": [ function( prop, value ) { + var tween = this.createTween( prop, value ); + adjustCSS( tween.elem, prop, rcssNum.exec( value ), tween ); + return tween; + } ] + }, + + tweener: function( props, callback ) { + if ( isFunction( props ) ) { + callback = props; + props = [ "*" ]; + } else { + props = props.match( rnothtmlwhite ); + } + + var prop, + index = 0, + length = props.length; + + for ( ; index < length; index++ ) { + prop = props[ index ]; + Animation.tweeners[ prop ] = Animation.tweeners[ prop ] || []; + Animation.tweeners[ prop ].unshift( callback ); + } + }, + 
+ prefilters: [ defaultPrefilter ], + + prefilter: function( callback, prepend ) { + if ( prepend ) { + Animation.prefilters.unshift( callback ); + } else { + Animation.prefilters.push( callback ); + } + } +} ); + +jQuery.speed = function( speed, easing, fn ) { + var opt = speed && typeof speed === "object" ? jQuery.extend( {}, speed ) : { + complete: fn || !fn && easing || + isFunction( speed ) && speed, + duration: speed, + easing: fn && easing || easing && !isFunction( easing ) && easing + }; + + // Go to the end state if fx are off + if ( jQuery.fx.off ) { + opt.duration = 0; + + } else { + if ( typeof opt.duration !== "number" ) { + if ( opt.duration in jQuery.fx.speeds ) { + opt.duration = jQuery.fx.speeds[ opt.duration ]; + + } else { + opt.duration = jQuery.fx.speeds._default; + } + } + } + + // Normalize opt.queue - true/undefined/null -> "fx" + if ( opt.queue == null || opt.queue === true ) { + opt.queue = "fx"; + } + + // Queueing + opt.old = opt.complete; + + opt.complete = function() { + if ( isFunction( opt.old ) ) { + opt.old.call( this ); + } + + if ( opt.queue ) { + jQuery.dequeue( this, opt.queue ); + } + }; + + return opt; +}; + +jQuery.fn.extend( { + fadeTo: function( speed, to, easing, callback ) { + + // Show any hidden elements after setting opacity to 0 + return this.filter( isHiddenWithinTree ).css( "opacity", 0 ).show() + + // Animate to the value specified + .end().animate( { opacity: to }, speed, easing, callback ); + }, + animate: function( prop, speed, easing, callback ) { + var empty = jQuery.isEmptyObject( prop ), + optall = jQuery.speed( speed, easing, callback ), + doAnimation = function() { + + // Operate on a copy of prop so per-property easing won't be lost + var anim = Animation( this, jQuery.extend( {}, prop ), optall ); + + // Empty animations, or finishing resolves immediately + if ( empty || dataPriv.get( this, "finish" ) ) { + anim.stop( true ); + } + }; + doAnimation.finish = doAnimation; + + return empty || optall.queue 
=== false ? + this.each( doAnimation ) : + this.queue( optall.queue, doAnimation ); + }, + stop: function( type, clearQueue, gotoEnd ) { + var stopQueue = function( hooks ) { + var stop = hooks.stop; + delete hooks.stop; + stop( gotoEnd ); + }; + + if ( typeof type !== "string" ) { + gotoEnd = clearQueue; + clearQueue = type; + type = undefined; + } + if ( clearQueue ) { + this.queue( type || "fx", [] ); + } + + return this.each( function() { + var dequeue = true, + index = type != null && type + "queueHooks", + timers = jQuery.timers, + data = dataPriv.get( this ); + + if ( index ) { + if ( data[ index ] && data[ index ].stop ) { + stopQueue( data[ index ] ); + } + } else { + for ( index in data ) { + if ( data[ index ] && data[ index ].stop && rrun.test( index ) ) { + stopQueue( data[ index ] ); + } + } + } + + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && + ( type == null || timers[ index ].queue === type ) ) { + + timers[ index ].anim.stop( gotoEnd ); + dequeue = false; + timers.splice( index, 1 ); + } + } + + // Start the next in the queue if the last step wasn't forced. + // Timers currently will call their complete callbacks, which + // will dequeue but only if they were gotoEnd. + if ( dequeue || !gotoEnd ) { + jQuery.dequeue( this, type ); + } + } ); + }, + finish: function( type ) { + if ( type !== false ) { + type = type || "fx"; + } + return this.each( function() { + var index, + data = dataPriv.get( this ), + queue = data[ type + "queue" ], + hooks = data[ type + "queueHooks" ], + timers = jQuery.timers, + length = queue ? 
queue.length : 0; + + // Enable finishing flag on private data + data.finish = true; + + // Empty the queue first + jQuery.queue( this, type, [] ); + + if ( hooks && hooks.stop ) { + hooks.stop.call( this, true ); + } + + // Look for any active animations, and finish them + for ( index = timers.length; index--; ) { + if ( timers[ index ].elem === this && timers[ index ].queue === type ) { + timers[ index ].anim.stop( true ); + timers.splice( index, 1 ); + } + } + + // Look for any animations in the old queue and finish them + for ( index = 0; index < length; index++ ) { + if ( queue[ index ] && queue[ index ].finish ) { + queue[ index ].finish.call( this ); + } + } + + // Turn off finishing flag + delete data.finish; + } ); + } +} ); + +jQuery.each( [ "toggle", "show", "hide" ], function( _i, name ) { + var cssFn = jQuery.fn[ name ]; + jQuery.fn[ name ] = function( speed, easing, callback ) { + return speed == null || typeof speed === "boolean" ? + cssFn.apply( this, arguments ) : + this.animate( genFx( name, true ), speed, easing, callback ); + }; +} ); + +// Generate shortcuts for custom animations +jQuery.each( { + slideDown: genFx( "show" ), + slideUp: genFx( "hide" ), + slideToggle: genFx( "toggle" ), + fadeIn: { opacity: "show" }, + fadeOut: { opacity: "hide" }, + fadeToggle: { opacity: "toggle" } +}, function( name, props ) { + jQuery.fn[ name ] = function( speed, easing, callback ) { + return this.animate( props, speed, easing, callback ); + }; +} ); + +jQuery.timers = []; +jQuery.fx.tick = function() { + var timer, + i = 0, + timers = jQuery.timers; + + fxNow = Date.now(); + + for ( ; i < timers.length; i++ ) { + timer = timers[ i ]; + + // Run the timer and safely remove it when done (allowing for external removal) + if ( !timer() && timers[ i ] === timer ) { + timers.splice( i--, 1 ); + } + } + + if ( !timers.length ) { + jQuery.fx.stop(); + } + fxNow = undefined; +}; + +jQuery.fx.timer = function( timer ) { + jQuery.timers.push( timer ); + 
jQuery.fx.start(); +}; + +jQuery.fx.interval = 13; +jQuery.fx.start = function() { + if ( inProgress ) { + return; + } + + inProgress = true; + schedule(); +}; + +jQuery.fx.stop = function() { + inProgress = null; +}; + +jQuery.fx.speeds = { + slow: 600, + fast: 200, + + // Default speed + _default: 400 +}; + + +// Based off of the plugin by Clint Helfers, with permission. +// https://web.archive.org/web/20100324014747/http://blindsignals.com/index.php/2009/07/jquery-delay/ +jQuery.fn.delay = function( time, type ) { + time = jQuery.fx ? jQuery.fx.speeds[ time ] || time : time; + type = type || "fx"; + + return this.queue( type, function( next, hooks ) { + var timeout = window.setTimeout( next, time ); + hooks.stop = function() { + window.clearTimeout( timeout ); + }; + } ); +}; + + +( function() { + var input = document.createElement( "input" ), + select = document.createElement( "select" ), + opt = select.appendChild( document.createElement( "option" ) ); + + input.type = "checkbox"; + + // Support: Android <=4.3 only + // Default value for a checkbox should be "on" + support.checkOn = input.value !== ""; + + // Support: IE <=11 only + // Must access selectedIndex to make default options select + support.optSelected = opt.selected; + + // Support: IE <=11 only + // An input loses its value after becoming a radio + input = document.createElement( "input" ); + input.value = "t"; + input.type = "radio"; + support.radioValue = input.value === "t"; +} )(); + + +var boolHook, + attrHandle = jQuery.expr.attrHandle; + +jQuery.fn.extend( { + attr: function( name, value ) { + return access( this, jQuery.attr, name, value, arguments.length > 1 ); + }, + + removeAttr: function( name ) { + return this.each( function() { + jQuery.removeAttr( this, name ); + } ); + } +} ); + +jQuery.extend( { + attr: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set attributes on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || 
nType === 2 ) { + return; + } + + // Fallback to prop when attributes are not supported + if ( typeof elem.getAttribute === "undefined" ) { + return jQuery.prop( elem, name, value ); + } + + // Attribute hooks are determined by the lowercase version + // Grab necessary hook if one is defined + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + hooks = jQuery.attrHooks[ name.toLowerCase() ] || + ( jQuery.expr.match.bool.test( name ) ? boolHook : undefined ); + } + + if ( value !== undefined ) { + if ( value === null ) { + jQuery.removeAttr( elem, name ); + return; + } + + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + elem.setAttribute( name, value + "" ); + return value; + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + ret = jQuery.find.attr( elem, name ); + + // Non-existent attributes return null, we normalize to undefined + return ret == null ? undefined : ret; + }, + + attrHooks: { + type: { + set: function( elem, value ) { + if ( !support.radioValue && value === "radio" && + nodeName( elem, "input" ) ) { + var val = elem.value; + elem.setAttribute( "type", value ); + if ( val ) { + elem.value = val; + } + return value; + } + } + } + }, + + removeAttr: function( elem, value ) { + var name, + i = 0, + + // Attribute names can contain non-HTML whitespace characters + // https://html.spec.whatwg.org/multipage/syntax.html#attributes-2 + attrNames = value && value.match( rnothtmlwhite ); + + if ( attrNames && elem.nodeType === 1 ) { + while ( ( name = attrNames[ i++ ] ) ) { + elem.removeAttribute( name ); + } + } + } +} ); + +// Hooks for boolean attributes +boolHook = { + set: function( elem, value, name ) { + if ( value === false ) { + + // Remove boolean attributes when set to false + jQuery.removeAttr( elem, name ); + } else { + elem.setAttribute( name, name ); + } + return name; + } +}; + +jQuery.each( 
jQuery.expr.match.bool.source.match( /\w+/g ), function( _i, name ) { + var getter = attrHandle[ name ] || jQuery.find.attr; + + attrHandle[ name ] = function( elem, name, isXML ) { + var ret, handle, + lowercaseName = name.toLowerCase(); + + if ( !isXML ) { + + // Avoid an infinite loop by temporarily removing this function from the getter + handle = attrHandle[ lowercaseName ]; + attrHandle[ lowercaseName ] = ret; + ret = getter( elem, name, isXML ) != null ? + lowercaseName : + null; + attrHandle[ lowercaseName ] = handle; + } + return ret; + }; +} ); + + + + +var rfocusable = /^(?:input|select|textarea|button)$/i, + rclickable = /^(?:a|area)$/i; + +jQuery.fn.extend( { + prop: function( name, value ) { + return access( this, jQuery.prop, name, value, arguments.length > 1 ); + }, + + removeProp: function( name ) { + return this.each( function() { + delete this[ jQuery.propFix[ name ] || name ]; + } ); + } +} ); + +jQuery.extend( { + prop: function( elem, name, value ) { + var ret, hooks, + nType = elem.nodeType; + + // Don't get/set properties on text, comment and attribute nodes + if ( nType === 3 || nType === 8 || nType === 2 ) { + return; + } + + if ( nType !== 1 || !jQuery.isXMLDoc( elem ) ) { + + // Fix name and attach hooks + name = jQuery.propFix[ name ] || name; + hooks = jQuery.propHooks[ name ]; + } + + if ( value !== undefined ) { + if ( hooks && "set" in hooks && + ( ret = hooks.set( elem, value, name ) ) !== undefined ) { + return ret; + } + + return ( elem[ name ] = value ); + } + + if ( hooks && "get" in hooks && ( ret = hooks.get( elem, name ) ) !== null ) { + return ret; + } + + return elem[ name ]; + }, + + propHooks: { + tabIndex: { + get: function( elem ) { + + // Support: IE <=9 - 11 only + // elem.tabIndex doesn't always return the + // correct value when it hasn't been explicitly set + // https://web.archive.org/web/20141116233347/http://fluidproject.org/blog/2008/01/09/getting-setting-and-removing-tabindex-values-with-javascript/ + // Use 
proper attribute retrieval(#12072) + var tabindex = jQuery.find.attr( elem, "tabindex" ); + + if ( tabindex ) { + return parseInt( tabindex, 10 ); + } + + if ( + rfocusable.test( elem.nodeName ) || + rclickable.test( elem.nodeName ) && + elem.href + ) { + return 0; + } + + return -1; + } + } + }, + + propFix: { + "for": "htmlFor", + "class": "className" + } +} ); + +// Support: IE <=11 only +// Accessing the selectedIndex property +// forces the browser to respect setting selected +// on the option +// The getter ensures a default option is selected +// when in an optgroup +// eslint rule "no-unused-expressions" is disabled for this code +// since it considers such accessions noop +if ( !support.optSelected ) { + jQuery.propHooks.selected = { + get: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent && parent.parentNode ) { + parent.parentNode.selectedIndex; + } + return null; + }, + set: function( elem ) { + + /* eslint no-unused-expressions: "off" */ + + var parent = elem.parentNode; + if ( parent ) { + parent.selectedIndex; + + if ( parent.parentNode ) { + parent.parentNode.selectedIndex; + } + } + } + }; +} + +jQuery.each( [ + "tabIndex", + "readOnly", + "maxLength", + "cellSpacing", + "cellPadding", + "rowSpan", + "colSpan", + "useMap", + "frameBorder", + "contentEditable" +], function() { + jQuery.propFix[ this.toLowerCase() ] = this; +} ); + + + + + // Strip and collapse whitespace according to HTML spec + // https://infra.spec.whatwg.org/#strip-and-collapse-ascii-whitespace + function stripAndCollapse( value ) { + var tokens = value.match( rnothtmlwhite ) || []; + return tokens.join( " " ); + } + + +function getClass( elem ) { + return elem.getAttribute && elem.getAttribute( "class" ) || ""; +} + +function classesToArray( value ) { + if ( Array.isArray( value ) ) { + return value; + } + if ( typeof value === "string" ) { + return value.match( rnothtmlwhite ) || []; + } + return []; +} + 
+jQuery.fn.extend( { + addClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).addClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + if ( cur.indexOf( " " + clazz + " " ) < 0 ) { + cur += clazz + " "; + } + } + + // Only assign if different to avoid unneeded rendering. + finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + removeClass: function( value ) { + var classes, elem, cur, curValue, clazz, j, finalValue, + i = 0; + + if ( isFunction( value ) ) { + return this.each( function( j ) { + jQuery( this ).removeClass( value.call( this, j, getClass( this ) ) ); + } ); + } + + if ( !arguments.length ) { + return this.attr( "class", "" ); + } + + classes = classesToArray( value ); + + if ( classes.length ) { + while ( ( elem = this[ i++ ] ) ) { + curValue = getClass( elem ); + + // This expression is here for better compressibility (see addClass) + cur = elem.nodeType === 1 && ( " " + stripAndCollapse( curValue ) + " " ); + + if ( cur ) { + j = 0; + while ( ( clazz = classes[ j++ ] ) ) { + + // Remove *all* instances + while ( cur.indexOf( " " + clazz + " " ) > -1 ) { + cur = cur.replace( " " + clazz + " ", " " ); + } + } + + // Only assign if different to avoid unneeded rendering. 
+ finalValue = stripAndCollapse( cur ); + if ( curValue !== finalValue ) { + elem.setAttribute( "class", finalValue ); + } + } + } + } + + return this; + }, + + toggleClass: function( value, stateVal ) { + var type = typeof value, + isValidValue = type === "string" || Array.isArray( value ); + + if ( typeof stateVal === "boolean" && isValidValue ) { + return stateVal ? this.addClass( value ) : this.removeClass( value ); + } + + if ( isFunction( value ) ) { + return this.each( function( i ) { + jQuery( this ).toggleClass( + value.call( this, i, getClass( this ), stateVal ), + stateVal + ); + } ); + } + + return this.each( function() { + var className, i, self, classNames; + + if ( isValidValue ) { + + // Toggle individual class names + i = 0; + self = jQuery( this ); + classNames = classesToArray( value ); + + while ( ( className = classNames[ i++ ] ) ) { + + // Check each className given, space separated list + if ( self.hasClass( className ) ) { + self.removeClass( className ); + } else { + self.addClass( className ); + } + } + + // Toggle whole class name + } else if ( value === undefined || type === "boolean" ) { + className = getClass( this ); + if ( className ) { + + // Store className if set + dataPriv.set( this, "__className__", className ); + } + + // If the element has a class name or if we're passed `false`, + // then remove the whole classname (if there was one, the above saved it). + // Otherwise bring back whatever was previously saved (if anything), + // falling back to the empty string if nothing was stored. + if ( this.setAttribute ) { + this.setAttribute( "class", + className || value === false ? 
+ "" : + dataPriv.get( this, "__className__" ) || "" + ); + } + } + } ); + }, + + hasClass: function( selector ) { + var className, elem, + i = 0; + + className = " " + selector + " "; + while ( ( elem = this[ i++ ] ) ) { + if ( elem.nodeType === 1 && + ( " " + stripAndCollapse( getClass( elem ) ) + " " ).indexOf( className ) > -1 ) { + return true; + } + } + + return false; + } +} ); + + + + +var rreturn = /\r/g; + +jQuery.fn.extend( { + val: function( value ) { + var hooks, ret, valueIsFunction, + elem = this[ 0 ]; + + if ( !arguments.length ) { + if ( elem ) { + hooks = jQuery.valHooks[ elem.type ] || + jQuery.valHooks[ elem.nodeName.toLowerCase() ]; + + if ( hooks && + "get" in hooks && + ( ret = hooks.get( elem, "value" ) ) !== undefined + ) { + return ret; + } + + ret = elem.value; + + // Handle most common string cases + if ( typeof ret === "string" ) { + return ret.replace( rreturn, "" ); + } + + // Handle cases where value is null/undef or number + return ret == null ? "" : ret; + } + + return; + } + + valueIsFunction = isFunction( value ); + + return this.each( function( i ) { + var val; + + if ( this.nodeType !== 1 ) { + return; + } + + if ( valueIsFunction ) { + val = value.call( this, i, jQuery( this ).val() ); + } else { + val = value; + } + + // Treat null/undefined as ""; convert numbers to string + if ( val == null ) { + val = ""; + + } else if ( typeof val === "number" ) { + val += ""; + + } else if ( Array.isArray( val ) ) { + val = jQuery.map( val, function( value ) { + return value == null ? "" : value + ""; + } ); + } + + hooks = jQuery.valHooks[ this.type ] || jQuery.valHooks[ this.nodeName.toLowerCase() ]; + + // If set returns undefined, fall back to normal setting + if ( !hooks || !( "set" in hooks ) || hooks.set( this, val, "value" ) === undefined ) { + this.value = val; + } + } ); + } +} ); + +jQuery.extend( { + valHooks: { + option: { + get: function( elem ) { + + var val = jQuery.find.attr( elem, "value" ); + return val != null ? 
+ val : + + // Support: IE <=10 - 11 only + // option.text throws exceptions (#14686, #14858) + // Strip and collapse whitespace + // https://html.spec.whatwg.org/#strip-and-collapse-whitespace + stripAndCollapse( jQuery.text( elem ) ); + } + }, + select: { + get: function( elem ) { + var value, option, i, + options = elem.options, + index = elem.selectedIndex, + one = elem.type === "select-one", + values = one ? null : [], + max = one ? index + 1 : options.length; + + if ( index < 0 ) { + i = max; + + } else { + i = one ? index : 0; + } + + // Loop through all the selected options + for ( ; i < max; i++ ) { + option = options[ i ]; + + // Support: IE <=9 only + // IE8-9 doesn't update selected after form reset (#2551) + if ( ( option.selected || i === index ) && + + // Don't return options that are disabled or in a disabled optgroup + !option.disabled && + ( !option.parentNode.disabled || + !nodeName( option.parentNode, "optgroup" ) ) ) { + + // Get the specific value for the option + value = jQuery( option ).val(); + + // We don't need an array for one selects + if ( one ) { + return value; + } + + // Multi-Selects return an array + values.push( value ); + } + } + + return values; + }, + + set: function( elem, value ) { + var optionSet, option, + options = elem.options, + values = jQuery.makeArray( value ), + i = options.length; + + while ( i-- ) { + option = options[ i ]; + + /* eslint-disable no-cond-assign */ + + if ( option.selected = + jQuery.inArray( jQuery.valHooks.option.get( option ), values ) > -1 + ) { + optionSet = true; + } + + /* eslint-enable no-cond-assign */ + } + + // Force browsers to behave consistently when non-matching value is set + if ( !optionSet ) { + elem.selectedIndex = -1; + } + return values; + } + } + } +} ); + +// Radios and checkboxes getter/setter +jQuery.each( [ "radio", "checkbox" ], function() { + jQuery.valHooks[ this ] = { + set: function( elem, value ) { + if ( Array.isArray( value ) ) { + return ( elem.checked = 
jQuery.inArray( jQuery( elem ).val(), value ) > -1 ); + } + } + }; + if ( !support.checkOn ) { + jQuery.valHooks[ this ].get = function( elem ) { + return elem.getAttribute( "value" ) === null ? "on" : elem.value; + }; + } +} ); + + + + +// Return jQuery for attributes-only inclusion + + +support.focusin = "onfocusin" in window; + + +var rfocusMorph = /^(?:focusinfocus|focusoutblur)$/, + stopPropagationCallback = function( e ) { + e.stopPropagation(); + }; + +jQuery.extend( jQuery.event, { + + trigger: function( event, data, elem, onlyHandlers ) { + + var i, cur, tmp, bubbleType, ontype, handle, special, lastElement, + eventPath = [ elem || document ], + type = hasOwn.call( event, "type" ) ? event.type : event, + namespaces = hasOwn.call( event, "namespace" ) ? event.namespace.split( "." ) : []; + + cur = lastElement = tmp = elem = elem || document; + + // Don't do events on text and comment nodes + if ( elem.nodeType === 3 || elem.nodeType === 8 ) { + return; + } + + // focus/blur morphs to focusin/out; ensure we're not firing them right now + if ( rfocusMorph.test( type + jQuery.event.triggered ) ) { + return; + } + + if ( type.indexOf( "." ) > -1 ) { + + // Namespaced trigger; create a regexp to match event type in handle() + namespaces = type.split( "." ); + type = namespaces.shift(); + namespaces.sort(); + } + ontype = type.indexOf( ":" ) < 0 && "on" + type; + + // Caller can pass in a jQuery.Event object, Object, or just an event type string + event = event[ jQuery.expando ] ? + event : + new jQuery.Event( type, typeof event === "object" && event ); + + // Trigger bitmask: & 1 for native handlers; & 2 for jQuery (always true) + event.isTrigger = onlyHandlers ? 2 : 3; + event.namespace = namespaces.join( "." ); + event.rnamespace = event.namespace ? 
+ new RegExp( "(^|\\.)" + namespaces.join( "\\.(?:.*\\.|)" ) + "(\\.|$)" ) : + null; + + // Clean up the event in case it is being reused + event.result = undefined; + if ( !event.target ) { + event.target = elem; + } + + // Clone any incoming data and prepend the event, creating the handler arg list + data = data == null ? + [ event ] : + jQuery.makeArray( data, [ event ] ); + + // Allow special events to draw outside the lines + special = jQuery.event.special[ type ] || {}; + if ( !onlyHandlers && special.trigger && special.trigger.apply( elem, data ) === false ) { + return; + } + + // Determine event propagation path in advance, per W3C events spec (#9951) + // Bubble up to document, then to window; watch for a global ownerDocument var (#9724) + if ( !onlyHandlers && !special.noBubble && !isWindow( elem ) ) { + + bubbleType = special.delegateType || type; + if ( !rfocusMorph.test( bubbleType + type ) ) { + cur = cur.parentNode; + } + for ( ; cur; cur = cur.parentNode ) { + eventPath.push( cur ); + tmp = cur; + } + + // Only add window if we got to document (e.g., not plain obj or detached DOM) + if ( tmp === ( elem.ownerDocument || document ) ) { + eventPath.push( tmp.defaultView || tmp.parentWindow || window ); + } + } + + // Fire handlers on the event path + i = 0; + while ( ( cur = eventPath[ i++ ] ) && !event.isPropagationStopped() ) { + lastElement = cur; + event.type = i > 1 ? 
+ bubbleType : + special.bindType || type; + + // jQuery handler + handle = ( + dataPriv.get( cur, "events" ) || Object.create( null ) + )[ event.type ] && + dataPriv.get( cur, "handle" ); + if ( handle ) { + handle.apply( cur, data ); + } + + // Native handler + handle = ontype && cur[ ontype ]; + if ( handle && handle.apply && acceptData( cur ) ) { + event.result = handle.apply( cur, data ); + if ( event.result === false ) { + event.preventDefault(); + } + } + } + event.type = type; + + // If nobody prevented the default action, do it now + if ( !onlyHandlers && !event.isDefaultPrevented() ) { + + if ( ( !special._default || + special._default.apply( eventPath.pop(), data ) === false ) && + acceptData( elem ) ) { + + // Call a native DOM method on the target with the same name as the event. + // Don't do default actions on window, that's where global variables be (#6170) + if ( ontype && isFunction( elem[ type ] ) && !isWindow( elem ) ) { + + // Don't re-trigger an onFOO event when we call its FOO() method + tmp = elem[ ontype ]; + + if ( tmp ) { + elem[ ontype ] = null; + } + + // Prevent re-triggering of the same event, since we already bubbled it above + jQuery.event.triggered = type; + + if ( event.isPropagationStopped() ) { + lastElement.addEventListener( type, stopPropagationCallback ); + } + + elem[ type ](); + + if ( event.isPropagationStopped() ) { + lastElement.removeEventListener( type, stopPropagationCallback ); + } + + jQuery.event.triggered = undefined; + + if ( tmp ) { + elem[ ontype ] = tmp; + } + } + } + } + + return event.result; + }, + + // Piggyback on a donor event to simulate a different one + // Used only for `focus(in | out)` events + simulate: function( type, elem, event ) { + var e = jQuery.extend( + new jQuery.Event(), + event, + { + type: type, + isSimulated: true + } + ); + + jQuery.event.trigger( e, null, elem ); + } + +} ); + +jQuery.fn.extend( { + + trigger: function( type, data ) { + return this.each( function() { + 
jQuery.event.trigger( type, data, this ); + } ); + }, + triggerHandler: function( type, data ) { + var elem = this[ 0 ]; + if ( elem ) { + return jQuery.event.trigger( type, data, elem, true ); + } + } +} ); + + +// Support: Firefox <=44 +// Firefox doesn't have focus(in | out) events +// Related ticket - https://bugzilla.mozilla.org/show_bug.cgi?id=687787 +// +// Support: Chrome <=48 - 49, Safari <=9.0 - 9.1 +// focus(in | out) events fire after focus & blur events, +// which is spec violation - http://www.w3.org/TR/DOM-Level-3-Events/#events-focusevent-event-order +// Related ticket - https://bugs.chromium.org/p/chromium/issues/detail?id=449857 +if ( !support.focusin ) { + jQuery.each( { focus: "focusin", blur: "focusout" }, function( orig, fix ) { + + // Attach a single capturing handler on the document while someone wants focusin/focusout + var handler = function( event ) { + jQuery.event.simulate( fix, event.target, jQuery.event.fix( event ) ); + }; + + jQuery.event.special[ fix ] = { + setup: function() { + + // Handle: regular nodes (via `this.ownerDocument`), window + // (via `this.document`) & document (via `this`). 
+ var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ); + + if ( !attaches ) { + doc.addEventListener( orig, handler, true ); + } + dataPriv.access( doc, fix, ( attaches || 0 ) + 1 ); + }, + teardown: function() { + var doc = this.ownerDocument || this.document || this, + attaches = dataPriv.access( doc, fix ) - 1; + + if ( !attaches ) { + doc.removeEventListener( orig, handler, true ); + dataPriv.remove( doc, fix ); + + } else { + dataPriv.access( doc, fix, attaches ); + } + } + }; + } ); +} +var location = window.location; + +var nonce = { guid: Date.now() }; + +var rquery = ( /\?/ ); + + + +// Cross-browser xml parsing +jQuery.parseXML = function( data ) { + var xml; + if ( !data || typeof data !== "string" ) { + return null; + } + + // Support: IE 9 - 11 only + // IE throws on parseFromString with invalid input. + try { + xml = ( new window.DOMParser() ).parseFromString( data, "text/xml" ); + } catch ( e ) { + xml = undefined; + } + + if ( !xml || xml.getElementsByTagName( "parsererror" ).length ) { + jQuery.error( "Invalid XML: " + data ); + } + return xml; +}; + + +var + rbracket = /\[\]$/, + rCRLF = /\r?\n/g, + rsubmitterTypes = /^(?:submit|button|image|reset|file)$/i, + rsubmittable = /^(?:input|select|textarea|keygen)/i; + +function buildParams( prefix, obj, traditional, add ) { + var name; + + if ( Array.isArray( obj ) ) { + + // Serialize array item. + jQuery.each( obj, function( i, v ) { + if ( traditional || rbracket.test( prefix ) ) { + + // Treat each array item as a scalar. + add( prefix, v ); + + } else { + + // Item is non-scalar (array or object), encode its numeric index. + buildParams( + prefix + "[" + ( typeof v === "object" && v != null ? i : "" ) + "]", + v, + traditional, + add + ); + } + } ); + + } else if ( !traditional && toType( obj ) === "object" ) { + + // Serialize object item. 
+ for ( name in obj ) { + buildParams( prefix + "[" + name + "]", obj[ name ], traditional, add ); + } + + } else { + + // Serialize scalar item. + add( prefix, obj ); + } +} + +// Serialize an array of form elements or a set of +// key/values into a query string +jQuery.param = function( a, traditional ) { + var prefix, + s = [], + add = function( key, valueOrFunction ) { + + // If value is a function, invoke it and use its return value + var value = isFunction( valueOrFunction ) ? + valueOrFunction() : + valueOrFunction; + + s[ s.length ] = encodeURIComponent( key ) + "=" + + encodeURIComponent( value == null ? "" : value ); + }; + + if ( a == null ) { + return ""; + } + + // If an array was passed in, assume that it is an array of form elements. + if ( Array.isArray( a ) || ( a.jquery && !jQuery.isPlainObject( a ) ) ) { + + // Serialize the form elements + jQuery.each( a, function() { + add( this.name, this.value ); + } ); + + } else { + + // If traditional, encode the "old" way (the way 1.3.2 or older + // did it), otherwise encode params recursively. + for ( prefix in a ) { + buildParams( prefix, a[ prefix ], traditional, add ); + } + } + + // Return the resulting serialization + return s.join( "&" ); +}; + +jQuery.fn.extend( { + serialize: function() { + return jQuery.param( this.serializeArray() ); + }, + serializeArray: function() { + return this.map( function() { + + // Can add propHook for "elements" to filter or add form elements + var elements = jQuery.prop( this, "elements" ); + return elements ? 
jQuery.makeArray( elements ) : this; + } ) + .filter( function() { + var type = this.type; + + // Use .is( ":disabled" ) so that fieldset[disabled] works + return this.name && !jQuery( this ).is( ":disabled" ) && + rsubmittable.test( this.nodeName ) && !rsubmitterTypes.test( type ) && + ( this.checked || !rcheckableType.test( type ) ); + } ) + .map( function( _i, elem ) { + var val = jQuery( this ).val(); + + if ( val == null ) { + return null; + } + + if ( Array.isArray( val ) ) { + return jQuery.map( val, function( val ) { + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ); + } + + return { name: elem.name, value: val.replace( rCRLF, "\r\n" ) }; + } ).get(); + } +} ); + + +var + r20 = /%20/g, + rhash = /#.*$/, + rantiCache = /([?&])_=[^&]*/, + rheaders = /^(.*?):[ \t]*([^\r\n]*)$/mg, + + // #7653, #8125, #8152: local protocol detection + rlocalProtocol = /^(?:about|app|app-storage|.+-extension|file|res|widget):$/, + rnoContent = /^(?:GET|HEAD)$/, + rprotocol = /^\/\//, + + /* Prefilters + * 1) They are useful to introduce custom dataTypes (see ajax/jsonp.js for an example) + * 2) These are called: + * - BEFORE asking for a transport + * - AFTER param serialization (s.data is a string if s.processData is true) + * 3) key is the dataType + * 4) the catchall symbol "*" can be used + * 5) execution will start with transport dataType and THEN continue down to "*" if needed + */ + prefilters = {}, + + /* Transports bindings + * 1) key is the dataType + * 2) the catchall symbol "*" can be used + * 3) selection will start with transport dataType and THEN go to "*" if needed + */ + transports = {}, + + // Avoid comment-prolog char sequence (#10098); must appease lint and evade compression + allTypes = "*/".concat( "*" ), + + // Anchor tag for parsing the document origin + originAnchor = document.createElement( "a" ); + originAnchor.href = location.href; + +// Base "constructor" for jQuery.ajaxPrefilter and jQuery.ajaxTransport +function 
addToPrefiltersOrTransports( structure ) { + + // dataTypeExpression is optional and defaults to "*" + return function( dataTypeExpression, func ) { + + if ( typeof dataTypeExpression !== "string" ) { + func = dataTypeExpression; + dataTypeExpression = "*"; + } + + var dataType, + i = 0, + dataTypes = dataTypeExpression.toLowerCase().match( rnothtmlwhite ) || []; + + if ( isFunction( func ) ) { + + // For each dataType in the dataTypeExpression + while ( ( dataType = dataTypes[ i++ ] ) ) { + + // Prepend if requested + if ( dataType[ 0 ] === "+" ) { + dataType = dataType.slice( 1 ) || "*"; + ( structure[ dataType ] = structure[ dataType ] || [] ).unshift( func ); + + // Otherwise append + } else { + ( structure[ dataType ] = structure[ dataType ] || [] ).push( func ); + } + } + } + }; +} + +// Base inspection function for prefilters and transports +function inspectPrefiltersOrTransports( structure, options, originalOptions, jqXHR ) { + + var inspected = {}, + seekingTransport = ( structure === transports ); + + function inspect( dataType ) { + var selected; + inspected[ dataType ] = true; + jQuery.each( structure[ dataType ] || [], function( _, prefilterOrFactory ) { + var dataTypeOrTransport = prefilterOrFactory( options, originalOptions, jqXHR ); + if ( typeof dataTypeOrTransport === "string" && + !seekingTransport && !inspected[ dataTypeOrTransport ] ) { + + options.dataTypes.unshift( dataTypeOrTransport ); + inspect( dataTypeOrTransport ); + return false; + } else if ( seekingTransport ) { + return !( selected = dataTypeOrTransport ); + } + } ); + return selected; + } + + return inspect( options.dataTypes[ 0 ] ) || !inspected[ "*" ] && inspect( "*" ); +} + +// A special extend for ajax options +// that takes "flat" options (not to be deep extended) +// Fixes #9887 +function ajaxExtend( target, src ) { + var key, deep, + flatOptions = jQuery.ajaxSettings.flatOptions || {}; + + for ( key in src ) { + if ( src[ key ] !== undefined ) { + ( flatOptions[ key ] ? 
target : ( deep || ( deep = {} ) ) )[ key ] = src[ key ]; + } + } + if ( deep ) { + jQuery.extend( true, target, deep ); + } + + return target; +} + +/* Handles responses to an ajax request: + * - finds the right dataType (mediates between content-type and expected dataType) + * - returns the corresponding response + */ +function ajaxHandleResponses( s, jqXHR, responses ) { + + var ct, type, finalDataType, firstDataType, + contents = s.contents, + dataTypes = s.dataTypes; + + // Remove auto dataType and get content-type in the process + while ( dataTypes[ 0 ] === "*" ) { + dataTypes.shift(); + if ( ct === undefined ) { + ct = s.mimeType || jqXHR.getResponseHeader( "Content-Type" ); + } + } + + // Check if we're dealing with a known content-type + if ( ct ) { + for ( type in contents ) { + if ( contents[ type ] && contents[ type ].test( ct ) ) { + dataTypes.unshift( type ); + break; + } + } + } + + // Check to see if we have a response for the expected dataType + if ( dataTypes[ 0 ] in responses ) { + finalDataType = dataTypes[ 0 ]; + } else { + + // Try convertible dataTypes + for ( type in responses ) { + if ( !dataTypes[ 0 ] || s.converters[ type + " " + dataTypes[ 0 ] ] ) { + finalDataType = type; + break; + } + if ( !firstDataType ) { + firstDataType = type; + } + } + + // Or just use first one + finalDataType = finalDataType || firstDataType; + } + + // If we found a dataType + // We add the dataType to the list if needed + // and return the corresponding response + if ( finalDataType ) { + if ( finalDataType !== dataTypes[ 0 ] ) { + dataTypes.unshift( finalDataType ); + } + return responses[ finalDataType ]; + } +} + +/* Chain conversions given the request and the original response + * Also sets the responseXXX fields on the jqXHR instance + */ +function ajaxConvert( s, response, jqXHR, isSuccess ) { + var conv2, current, conv, tmp, prev, + converters = {}, + + // Work with a copy of dataTypes in case we need to modify it for conversion + dataTypes = 
s.dataTypes.slice(); + + // Create converters map with lowercased keys + if ( dataTypes[ 1 ] ) { + for ( conv in s.converters ) { + converters[ conv.toLowerCase() ] = s.converters[ conv ]; + } + } + + current = dataTypes.shift(); + + // Convert to each sequential dataType + while ( current ) { + + if ( s.responseFields[ current ] ) { + jqXHR[ s.responseFields[ current ] ] = response; + } + + // Apply the dataFilter if provided + if ( !prev && isSuccess && s.dataFilter ) { + response = s.dataFilter( response, s.dataType ); + } + + prev = current; + current = dataTypes.shift(); + + if ( current ) { + + // There's only work to do if current dataType is non-auto + if ( current === "*" ) { + + current = prev; + + // Convert response if prev dataType is non-auto and differs from current + } else if ( prev !== "*" && prev !== current ) { + + // Seek a direct converter + conv = converters[ prev + " " + current ] || converters[ "* " + current ]; + + // If none found, seek a pair + if ( !conv ) { + for ( conv2 in converters ) { + + // If conv2 outputs current + tmp = conv2.split( " " ); + if ( tmp[ 1 ] === current ) { + + // If prev can be converted to accepted input + conv = converters[ prev + " " + tmp[ 0 ] ] || + converters[ "* " + tmp[ 0 ] ]; + if ( conv ) { + + // Condense equivalence converters + if ( conv === true ) { + conv = converters[ conv2 ]; + + // Otherwise, insert the intermediate dataType + } else if ( converters[ conv2 ] !== true ) { + current = tmp[ 0 ]; + dataTypes.unshift( tmp[ 1 ] ); + } + break; + } + } + } + } + + // Apply converter (if not an equivalence) + if ( conv !== true ) { + + // Unless errors are allowed to bubble, catch and return them + if ( conv && s.throws ) { + response = conv( response ); + } else { + try { + response = conv( response ); + } catch ( e ) { + return { + state: "parsererror", + error: conv ? 
e : "No conversion from " + prev + " to " + current + }; + } + } + } + } + } + } + + return { state: "success", data: response }; +} + +jQuery.extend( { + + // Counter for holding the number of active queries + active: 0, + + // Last-Modified header cache for next request + lastModified: {}, + etag: {}, + + ajaxSettings: { + url: location.href, + type: "GET", + isLocal: rlocalProtocol.test( location.protocol ), + global: true, + processData: true, + async: true, + contentType: "application/x-www-form-urlencoded; charset=UTF-8", + + /* + timeout: 0, + data: null, + dataType: null, + username: null, + password: null, + cache: null, + throws: false, + traditional: false, + headers: {}, + */ + + accepts: { + "*": allTypes, + text: "text/plain", + html: "text/html", + xml: "application/xml, text/xml", + json: "application/json, text/javascript" + }, + + contents: { + xml: /\bxml\b/, + html: /\bhtml/, + json: /\bjson\b/ + }, + + responseFields: { + xml: "responseXML", + text: "responseText", + json: "responseJSON" + }, + + // Data converters + // Keys separate source (or catchall "*") and destination types with a single space + converters: { + + // Convert anything to text + "* text": String, + + // Text to html (true = no transformation) + "text html": true, + + // Evaluate text as a json expression + "text json": JSON.parse, + + // Parse text as xml + "text xml": jQuery.parseXML + }, + + // For options that shouldn't be deep extended: + // you can add your own custom options here if + // and when you create one that shouldn't be + // deep extended (see ajaxExtend) + flatOptions: { + url: true, + context: true + } + }, + + // Creates a full fledged settings object into target + // with both ajaxSettings and settings fields. + // If target is omitted, writes into ajaxSettings. + ajaxSetup: function( target, settings ) { + return settings ? 
+ + // Building a settings object + ajaxExtend( ajaxExtend( target, jQuery.ajaxSettings ), settings ) : + + // Extending ajaxSettings + ajaxExtend( jQuery.ajaxSettings, target ); + }, + + ajaxPrefilter: addToPrefiltersOrTransports( prefilters ), + ajaxTransport: addToPrefiltersOrTransports( transports ), + + // Main method + ajax: function( url, options ) { + + // If url is an object, simulate pre-1.5 signature + if ( typeof url === "object" ) { + options = url; + url = undefined; + } + + // Force options to be an object + options = options || {}; + + var transport, + + // URL without anti-cache param + cacheURL, + + // Response headers + responseHeadersString, + responseHeaders, + + // timeout handle + timeoutTimer, + + // Url cleanup var + urlAnchor, + + // Request state (becomes false upon send and true upon completion) + completed, + + // To know if global events are to be dispatched + fireGlobals, + + // Loop variable + i, + + // uncached part of the url + uncached, + + // Create the final options object + s = jQuery.ajaxSetup( {}, options ), + + // Callbacks context + callbackContext = s.context || s, + + // Context for global events is callbackContext if it is a DOM node or jQuery collection + globalEventContext = s.context && + ( callbackContext.nodeType || callbackContext.jquery ) ? 
+ jQuery( callbackContext ) : + jQuery.event, + + // Deferreds + deferred = jQuery.Deferred(), + completeDeferred = jQuery.Callbacks( "once memory" ), + + // Status-dependent callbacks + statusCode = s.statusCode || {}, + + // Headers (they are sent all at once) + requestHeaders = {}, + requestHeadersNames = {}, + + // Default abort message + strAbort = "canceled", + + // Fake xhr + jqXHR = { + readyState: 0, + + // Builds headers hashtable if needed + getResponseHeader: function( key ) { + var match; + if ( completed ) { + if ( !responseHeaders ) { + responseHeaders = {}; + while ( ( match = rheaders.exec( responseHeadersString ) ) ) { + responseHeaders[ match[ 1 ].toLowerCase() + " " ] = + ( responseHeaders[ match[ 1 ].toLowerCase() + " " ] || [] ) + .concat( match[ 2 ] ); + } + } + match = responseHeaders[ key.toLowerCase() + " " ]; + } + return match == null ? null : match.join( ", " ); + }, + + // Raw string + getAllResponseHeaders: function() { + return completed ? responseHeadersString : null; + }, + + // Caches the header + setRequestHeader: function( name, value ) { + if ( completed == null ) { + name = requestHeadersNames[ name.toLowerCase() ] = + requestHeadersNames[ name.toLowerCase() ] || name; + requestHeaders[ name ] = value; + } + return this; + }, + + // Overrides response content-type header + overrideMimeType: function( type ) { + if ( completed == null ) { + s.mimeType = type; + } + return this; + }, + + // Status-dependent callbacks + statusCode: function( map ) { + var code; + if ( map ) { + if ( completed ) { + + // Execute the appropriate callbacks + jqXHR.always( map[ jqXHR.status ] ); + } else { + + // Lazy-add the new callbacks in a way that preserves old ones + for ( code in map ) { + statusCode[ code ] = [ statusCode[ code ], map[ code ] ]; + } + } + } + return this; + }, + + // Cancel the request + abort: function( statusText ) { + var finalText = statusText || strAbort; + if ( transport ) { + transport.abort( finalText ); + } + done( 
0, finalText ); + return this; + } + }; + + // Attach deferreds + deferred.promise( jqXHR ); + + // Add protocol if not provided (prefilters might expect it) + // Handle falsy url in the settings object (#10093: consistency with old signature) + // We also use the url parameter if available + s.url = ( ( url || s.url || location.href ) + "" ) + .replace( rprotocol, location.protocol + "//" ); + + // Alias method option to type as per ticket #12004 + s.type = options.method || options.type || s.method || s.type; + + // Extract dataTypes list + s.dataTypes = ( s.dataType || "*" ).toLowerCase().match( rnothtmlwhite ) || [ "" ]; + + // A cross-domain request is in order when the origin doesn't match the current origin. + if ( s.crossDomain == null ) { + urlAnchor = document.createElement( "a" ); + + // Support: IE <=8 - 11, Edge 12 - 15 + // IE throws exception on accessing the href property if url is malformed, + // e.g. http://example.com:80x/ + try { + urlAnchor.href = s.url; + + // Support: IE <=8 - 11 only + // Anchor's host property isn't correctly set when s.url is relative + urlAnchor.href = urlAnchor.href; + s.crossDomain = originAnchor.protocol + "//" + originAnchor.host !== + urlAnchor.protocol + "//" + urlAnchor.host; + } catch ( e ) { + + // If there is an error parsing the URL, assume it is crossDomain, + // it can be rejected by the transport if it is invalid + s.crossDomain = true; + } + } + + // Convert data if not already a string + if ( s.data && s.processData && typeof s.data !== "string" ) { + s.data = jQuery.param( s.data, s.traditional ); + } + + // Apply prefilters + inspectPrefiltersOrTransports( prefilters, s, options, jqXHR ); + + // If request was aborted inside a prefilter, stop there + if ( completed ) { + return jqXHR; + } + + // We can fire global events as of now if asked to + // Don't fire events if jQuery.event is undefined in an AMD-usage scenario (#15118) + fireGlobals = jQuery.event && s.global; + + // Watch for a new set of 
requests + if ( fireGlobals && jQuery.active++ === 0 ) { + jQuery.event.trigger( "ajaxStart" ); + } + + // Uppercase the type + s.type = s.type.toUpperCase(); + + // Determine if request has content + s.hasContent = !rnoContent.test( s.type ); + + // Save the URL in case we're toying with the If-Modified-Since + // and/or If-None-Match header later on + // Remove hash to simplify url manipulation + cacheURL = s.url.replace( rhash, "" ); + + // More options handling for requests with no content + if ( !s.hasContent ) { + + // Remember the hash so we can put it back + uncached = s.url.slice( cacheURL.length ); + + // If data is available and should be processed, append data to url + if ( s.data && ( s.processData || typeof s.data === "string" ) ) { + cacheURL += ( rquery.test( cacheURL ) ? "&" : "?" ) + s.data; + + // #9682: remove data so that it's not used in an eventual retry + delete s.data; + } + + // Add or update anti-cache param if needed + if ( s.cache === false ) { + cacheURL = cacheURL.replace( rantiCache, "$1" ); + uncached = ( rquery.test( cacheURL ) ? "&" : "?" ) + "_=" + ( nonce.guid++ ) + + uncached; + } + + // Put hash and anti-cache on the URL that will be requested (gh-1732) + s.url = cacheURL + uncached; + + // Change '%20' to '+' if this is encoded form body content (gh-2658) + } else if ( s.data && s.processData && + ( s.contentType || "" ).indexOf( "application/x-www-form-urlencoded" ) === 0 ) { + s.data = s.data.replace( r20, "+" ); + } + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. 
+ if ( s.ifModified ) { + if ( jQuery.lastModified[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-Modified-Since", jQuery.lastModified[ cacheURL ] ); + } + if ( jQuery.etag[ cacheURL ] ) { + jqXHR.setRequestHeader( "If-None-Match", jQuery.etag[ cacheURL ] ); + } + } + + // Set the correct header, if data is being sent + if ( s.data && s.hasContent && s.contentType !== false || options.contentType ) { + jqXHR.setRequestHeader( "Content-Type", s.contentType ); + } + + // Set the Accepts header for the server, depending on the dataType + jqXHR.setRequestHeader( + "Accept", + s.dataTypes[ 0 ] && s.accepts[ s.dataTypes[ 0 ] ] ? + s.accepts[ s.dataTypes[ 0 ] ] + + ( s.dataTypes[ 0 ] !== "*" ? ", " + allTypes + "; q=0.01" : "" ) : + s.accepts[ "*" ] + ); + + // Check for headers option + for ( i in s.headers ) { + jqXHR.setRequestHeader( i, s.headers[ i ] ); + } + + // Allow custom headers/mimetypes and early abort + if ( s.beforeSend && + ( s.beforeSend.call( callbackContext, jqXHR, s ) === false || completed ) ) { + + // Abort if not done already and return + return jqXHR.abort(); + } + + // Aborting is no longer a cancellation + strAbort = "abort"; + + // Install callbacks on deferreds + completeDeferred.add( s.complete ); + jqXHR.done( s.success ); + jqXHR.fail( s.error ); + + // Get transport + transport = inspectPrefiltersOrTransports( transports, s, options, jqXHR ); + + // If no transport, we auto-abort + if ( !transport ) { + done( -1, "No Transport" ); + } else { + jqXHR.readyState = 1; + + // Send global event + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxSend", [ jqXHR, s ] ); + } + + // If request was aborted inside ajaxSend, stop there + if ( completed ) { + return jqXHR; + } + + // Timeout + if ( s.async && s.timeout > 0 ) { + timeoutTimer = window.setTimeout( function() { + jqXHR.abort( "timeout" ); + }, s.timeout ); + } + + try { + completed = false; + transport.send( requestHeaders, done ); + } catch ( e ) { + + // Rethrow post-completion 
exceptions + if ( completed ) { + throw e; + } + + // Propagate others as results + done( -1, e ); + } + } + + // Callback for when everything is done + function done( status, nativeStatusText, responses, headers ) { + var isSuccess, success, error, response, modified, + statusText = nativeStatusText; + + // Ignore repeat invocations + if ( completed ) { + return; + } + + completed = true; + + // Clear timeout if it exists + if ( timeoutTimer ) { + window.clearTimeout( timeoutTimer ); + } + + // Dereference transport for early garbage collection + // (no matter how long the jqXHR object will be used) + transport = undefined; + + // Cache response headers + responseHeadersString = headers || ""; + + // Set readyState + jqXHR.readyState = status > 0 ? 4 : 0; + + // Determine if successful + isSuccess = status >= 200 && status < 300 || status === 304; + + // Get response data + if ( responses ) { + response = ajaxHandleResponses( s, jqXHR, responses ); + } + + // Use a noop converter for missing script + if ( !isSuccess && jQuery.inArray( "script", s.dataTypes ) > -1 ) { + s.converters[ "text script" ] = function() {}; + } + + // Convert no matter what (that way responseXXX fields are always set) + response = ajaxConvert( s, response, jqXHR, isSuccess ); + + // If successful, handle type chaining + if ( isSuccess ) { + + // Set the If-Modified-Since and/or If-None-Match header, if in ifModified mode. 
+ if ( s.ifModified ) { + modified = jqXHR.getResponseHeader( "Last-Modified" ); + if ( modified ) { + jQuery.lastModified[ cacheURL ] = modified; + } + modified = jqXHR.getResponseHeader( "etag" ); + if ( modified ) { + jQuery.etag[ cacheURL ] = modified; + } + } + + // if no content + if ( status === 204 || s.type === "HEAD" ) { + statusText = "nocontent"; + + // if not modified + } else if ( status === 304 ) { + statusText = "notmodified"; + + // If we have data, let's convert it + } else { + statusText = response.state; + success = response.data; + error = response.error; + isSuccess = !error; + } + } else { + + // Extract error from statusText and normalize for non-aborts + error = statusText; + if ( status || !statusText ) { + statusText = "error"; + if ( status < 0 ) { + status = 0; + } + } + } + + // Set data for the fake xhr object + jqXHR.status = status; + jqXHR.statusText = ( nativeStatusText || statusText ) + ""; + + // Success/Error + if ( isSuccess ) { + deferred.resolveWith( callbackContext, [ success, statusText, jqXHR ] ); + } else { + deferred.rejectWith( callbackContext, [ jqXHR, statusText, error ] ); + } + + // Status-dependent callbacks + jqXHR.statusCode( statusCode ); + statusCode = undefined; + + if ( fireGlobals ) { + globalEventContext.trigger( isSuccess ? "ajaxSuccess" : "ajaxError", + [ jqXHR, s, isSuccess ? 
success : error ] ); + } + + // Complete + completeDeferred.fireWith( callbackContext, [ jqXHR, statusText ] ); + + if ( fireGlobals ) { + globalEventContext.trigger( "ajaxComplete", [ jqXHR, s ] ); + + // Handle the global AJAX counter + if ( !( --jQuery.active ) ) { + jQuery.event.trigger( "ajaxStop" ); + } + } + } + + return jqXHR; + }, + + getJSON: function( url, data, callback ) { + return jQuery.get( url, data, callback, "json" ); + }, + + getScript: function( url, callback ) { + return jQuery.get( url, undefined, callback, "script" ); + } +} ); + +jQuery.each( [ "get", "post" ], function( _i, method ) { + jQuery[ method ] = function( url, data, callback, type ) { + + // Shift arguments if data argument was omitted + if ( isFunction( data ) ) { + type = type || callback; + callback = data; + data = undefined; + } + + // The url can be an options object (which then must have .url) + return jQuery.ajax( jQuery.extend( { + url: url, + type: method, + dataType: type, + data: data, + success: callback + }, jQuery.isPlainObject( url ) && url ) ); + }; +} ); + +jQuery.ajaxPrefilter( function( s ) { + var i; + for ( i in s.headers ) { + if ( i.toLowerCase() === "content-type" ) { + s.contentType = s.headers[ i ] || ""; + } + } +} ); + + +jQuery._evalUrl = function( url, options, doc ) { + return jQuery.ajax( { + url: url, + + // Make this explicit, since user can override this through ajaxSetup (#11264) + type: "GET", + dataType: "script", + cache: true, + async: false, + global: false, + + // Only evaluate the response if it is successful (gh-4126) + // dataFilter is not invoked for failure responses, so using it instead + // of the default converter is kludgy but it works. 
+ converters: { + "text script": function() {} + }, + dataFilter: function( response ) { + jQuery.globalEval( response, options, doc ); + } + } ); +}; + + +jQuery.fn.extend( { + wrapAll: function( html ) { + var wrap; + + if ( this[ 0 ] ) { + if ( isFunction( html ) ) { + html = html.call( this[ 0 ] ); + } + + // The elements to wrap the target around + wrap = jQuery( html, this[ 0 ].ownerDocument ).eq( 0 ).clone( true ); + + if ( this[ 0 ].parentNode ) { + wrap.insertBefore( this[ 0 ] ); + } + + wrap.map( function() { + var elem = this; + + while ( elem.firstElementChild ) { + elem = elem.firstElementChild; + } + + return elem; + } ).append( this ); + } + + return this; + }, + + wrapInner: function( html ) { + if ( isFunction( html ) ) { + return this.each( function( i ) { + jQuery( this ).wrapInner( html.call( this, i ) ); + } ); + } + + return this.each( function() { + var self = jQuery( this ), + contents = self.contents(); + + if ( contents.length ) { + contents.wrapAll( html ); + + } else { + self.append( html ); + } + } ); + }, + + wrap: function( html ) { + var htmlIsFunction = isFunction( html ); + + return this.each( function( i ) { + jQuery( this ).wrapAll( htmlIsFunction ? 
html.call( this, i ) : html ); + } ); + }, + + unwrap: function( selector ) { + this.parent( selector ).not( "body" ).each( function() { + jQuery( this ).replaceWith( this.childNodes ); + } ); + return this; + } +} ); + + +jQuery.expr.pseudos.hidden = function( elem ) { + return !jQuery.expr.pseudos.visible( elem ); +}; +jQuery.expr.pseudos.visible = function( elem ) { + return !!( elem.offsetWidth || elem.offsetHeight || elem.getClientRects().length ); +}; + + + + +jQuery.ajaxSettings.xhr = function() { + try { + return new window.XMLHttpRequest(); + } catch ( e ) {} +}; + +var xhrSuccessStatus = { + + // File protocol always yields status code 0, assume 200 + 0: 200, + + // Support: IE <=9 only + // #1450: sometimes IE returns 1223 when it should be 204 + 1223: 204 + }, + xhrSupported = jQuery.ajaxSettings.xhr(); + +support.cors = !!xhrSupported && ( "withCredentials" in xhrSupported ); +support.ajax = xhrSupported = !!xhrSupported; + +jQuery.ajaxTransport( function( options ) { + var callback, errorCallback; + + // Cross domain only allowed if supported through XMLHttpRequest + if ( support.cors || xhrSupported && !options.crossDomain ) { + return { + send: function( headers, complete ) { + var i, + xhr = options.xhr(); + + xhr.open( + options.type, + options.url, + options.async, + options.username, + options.password + ); + + // Apply custom fields if provided + if ( options.xhrFields ) { + for ( i in options.xhrFields ) { + xhr[ i ] = options.xhrFields[ i ]; + } + } + + // Override mime type if needed + if ( options.mimeType && xhr.overrideMimeType ) { + xhr.overrideMimeType( options.mimeType ); + } + + // X-Requested-With header + // For cross-domain requests, seeing as conditions for a preflight are + // akin to a jigsaw puzzle, we simply never set it to be sure. + // (it can always be set on a per-request basis or even using ajaxSetup) + // For same-domain requests, won't change header if already provided. 
+ if ( !options.crossDomain && !headers[ "X-Requested-With" ] ) { + headers[ "X-Requested-With" ] = "XMLHttpRequest"; + } + + // Set headers + for ( i in headers ) { + xhr.setRequestHeader( i, headers[ i ] ); + } + + // Callback + callback = function( type ) { + return function() { + if ( callback ) { + callback = errorCallback = xhr.onload = + xhr.onerror = xhr.onabort = xhr.ontimeout = + xhr.onreadystatechange = null; + + if ( type === "abort" ) { + xhr.abort(); + } else if ( type === "error" ) { + + // Support: IE <=9 only + // On a manual native abort, IE9 throws + // errors on any property access that is not readyState + if ( typeof xhr.status !== "number" ) { + complete( 0, "error" ); + } else { + complete( + + // File: protocol always yields status 0; see #8605, #14207 + xhr.status, + xhr.statusText + ); + } + } else { + complete( + xhrSuccessStatus[ xhr.status ] || xhr.status, + xhr.statusText, + + // Support: IE <=9 only + // IE9 has no XHR2 but throws on binary (trac-11426) + // For XHR2 non-text, let the caller handle it (gh-2498) + ( xhr.responseType || "text" ) !== "text" || + typeof xhr.responseText !== "string" ? 
+ { binary: xhr.response } : + { text: xhr.responseText }, + xhr.getAllResponseHeaders() + ); + } + } + }; + }; + + // Listen to events + xhr.onload = callback(); + errorCallback = xhr.onerror = xhr.ontimeout = callback( "error" ); + + // Support: IE 9 only + // Use onreadystatechange to replace onabort + // to handle uncaught aborts + if ( xhr.onabort !== undefined ) { + xhr.onabort = errorCallback; + } else { + xhr.onreadystatechange = function() { + + // Check readyState before timeout as it changes + if ( xhr.readyState === 4 ) { + + // Allow onerror to be called first, + // but that will not handle a native abort + // Also, save errorCallback to a variable + // as xhr.onerror cannot be accessed + window.setTimeout( function() { + if ( callback ) { + errorCallback(); + } + } ); + } + }; + } + + // Create the abort callback + callback = callback( "abort" ); + + try { + + // Do send the request (this may raise an exception) + xhr.send( options.hasContent && options.data || null ); + } catch ( e ) { + + // #14683: Only rethrow if this hasn't been notified as an error yet + if ( callback ) { + throw e; + } + } + }, + + abort: function() { + if ( callback ) { + callback(); + } + } + }; + } +} ); + + + + +// Prevent auto-execution of scripts when no explicit dataType was provided (See gh-2432) +jQuery.ajaxPrefilter( function( s ) { + if ( s.crossDomain ) { + s.contents.script = false; + } +} ); + +// Install script dataType +jQuery.ajaxSetup( { + accepts: { + script: "text/javascript, application/javascript, " + + "application/ecmascript, application/x-ecmascript" + }, + contents: { + script: /\b(?:java|ecma)script\b/ + }, + converters: { + "text script": function( text ) { + jQuery.globalEval( text ); + return text; + } + } +} ); + +// Handle cache's special case and crossDomain +jQuery.ajaxPrefilter( "script", function( s ) { + if ( s.cache === undefined ) { + s.cache = false; + } + if ( s.crossDomain ) { + s.type = "GET"; + } +} ); + +// Bind script tag hack 
transport +jQuery.ajaxTransport( "script", function( s ) { + + // This transport only deals with cross domain or forced-by-attrs requests + if ( s.crossDomain || s.scriptAttrs ) { + var script, callback; + return { + send: function( _, complete ) { + script = jQuery( "\r\n"; + +// inject VBScript +document.write(IEBinaryToArray_ByteStr_Script); + +global.JSZipUtils._getBinaryFromXHR = function (xhr) { + var binary = xhr.responseBody; + var byteMapping = {}; + for ( var i = 0; i < 256; i++ ) { + for ( var j = 0; j < 256; j++ ) { + byteMapping[ String.fromCharCode( i + (j << 8) ) ] = + String.fromCharCode(i) + String.fromCharCode(j); + } + } + var rawBytes = IEBinaryToArray_ByteStr(binary); + var lastChr = IEBinaryToArray_ByteStr_Last(binary); + return rawBytes.replace(/[\s\S]/g, function( match ) { + return byteMapping[match]; + }) + lastChr; +}; + +// enforcing Stuk's coding style +// vim: set shiftwidth=4 softtabstop=4: + +},{}]},{},[1]) +; diff --git a/31/javadoc/jquery/jszip-utils/dist/jszip-utils-ie.min.js b/31/javadoc/jquery/jszip-utils/dist/jszip-utils-ie.min.js new file mode 100644 index 000000000..93d8bc8ef --- /dev/null +++ b/31/javadoc/jquery/jszip-utils/dist/jszip-utils-ie.min.js @@ -0,0 +1,10 @@ +/*! + +JSZipUtils - A collection of cross-browser utilities to go along with JSZip. + + +(c) 2014 Stuart Knightley, David Duponchel +Dual licenced under the MIT license or GPLv3. See https://raw.github.com/Stuk/jszip-utils/master/LICENSE.markdown. 
+ +*/ +!function a(b,c,d){function e(g,h){if(!c[g]){if(!b[g]){var i="function"==typeof require&&require;if(!h&&i)return i(g,!0);if(f)return f(g,!0);throw new Error("Cannot find module '"+g+"'")}var j=c[g]={exports:{}};b[g][0].call(j.exports,function(a){var c=b[g][1][a];return e(c?c:a)},j,j.exports,a,b,c,d)}return c[g].exports}for(var f="function"==typeof require&&require,g=0;g\r\n";document.write(b),a.JSZipUtils._getBinaryFromXHR=function(a){for(var b=a.responseBody,c={},d=0;256>d;d++)for(var e=0;256>e;e++)c[String.fromCharCode(d+(e<<8))]=String.fromCharCode(d)+String.fromCharCode(e);var f=IEBinaryToArray_ByteStr(b),g=IEBinaryToArray_ByteStr_Last(b);return f.replace(/[\s\S]/g,function(a){return c[a]})+g}},{}]},{},[1]); diff --git a/31/javadoc/jquery/jszip-utils/dist/jszip-utils.js b/31/javadoc/jquery/jszip-utils/dist/jszip-utils.js new file mode 100644 index 000000000..775895ec9 --- /dev/null +++ b/31/javadoc/jquery/jszip-utils/dist/jszip-utils.js @@ -0,0 +1,118 @@ +/*! + +JSZipUtils - A collection of cross-browser utilities to go along with JSZip. + + +(c) 2014 Stuart Knightley, David Duponchel +Dual licenced under the MIT license or GPLv3. See https://raw.github.com/Stuk/jszip-utils/master/LICENSE.markdown. 
+ +*/ +!function(e){"object"==typeof exports?module.exports=e():"function"==typeof define&&define.amd?define(e):"undefined"!=typeof window?window.JSZipUtils=e():"undefined"!=typeof global?global.JSZipUtils=e():"undefined"!=typeof self&&(self.JSZipUtils=e())}(function(){var define,module,exports;return (function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);throw new Error("Cannot find module '"+o+"'")}var f=n[o]={exports:{}};t[o][0].call(f.exports,function(e){var n=t[o][1][e];return s(n?n:e)},f,f.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o + +(c) 2014 Stuart Knightley, David Duponchel +Dual licenced under the MIT license or GPLv3. See https://raw.github.com/Stuk/jszip-utils/master/LICENSE.markdown. + +*/ +!function(a){"object"==typeof exports?module.exports=a():"function"==typeof define&&define.amd?define(a):"undefined"!=typeof window?window.JSZipUtils=a():"undefined"!=typeof global?global.JSZipUtils=a():"undefined"!=typeof self&&(self.JSZipUtils=a())}(function(){return function a(b,c,d){function e(g,h){if(!c[g]){if(!b[g]){var i="function"==typeof require&&require;if(!h&&i)return i(g,!0);if(f)return f(g,!0);throw new Error("Cannot find module '"+g+"'")}var j=c[g]={exports:{}};b[g][0].call(j.exports,function(a){var c=b[g][1][a];return e(c?c:a)},j,j.exports,a,b,c,d)}return c[g].exports}for(var f="function"==typeof require&&require,g=0;g + +(c) 2009-2016 Stuart Knightley +Dual licenced under the MIT license or GPLv3. See https://raw.github.com/Stuk/jszip/master/LICENSE.markdown. 
+ +JSZip uses the library pako released under the MIT license : +https://github.com/nodeca/pako/blob/master/LICENSE +*/ + +(function(f){if(typeof exports==="object"&&typeof module!=="undefined"){module.exports=f()}else if(typeof define==="function"&&define.amd){define([],f)}else{var g;if(typeof window!=="undefined"){g=window}else if(typeof global!=="undefined"){g=global}else if(typeof self!=="undefined"){g=self}else{g=this}g.JSZip = f()}})(function(){var define,module,exports;return (function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o> 2; + enc2 = ((chr1 & 3) << 4) | (chr2 >> 4); + enc3 = remainingBytes > 1 ? (((chr2 & 15) << 2) | (chr3 >> 6)) : 64; + enc4 = remainingBytes > 2 ? (chr3 & 63) : 64; + + output.push(_keyStr.charAt(enc1) + _keyStr.charAt(enc2) + _keyStr.charAt(enc3) + _keyStr.charAt(enc4)); + + } + + return output.join(""); +}; + +// public method for decoding +exports.decode = function(input) { + var chr1, chr2, chr3; + var enc1, enc2, enc3, enc4; + var i = 0, resultIndex = 0; + + var dataUrlPrefix = "data:"; + + if (input.substr(0, dataUrlPrefix.length) === dataUrlPrefix) { + // This is a common error: people give a data url + // (data:image/png;base64,iVBOR...) with a {base64: true} and + // wonders why things don't work. + // We can detect that the string input looks like a data url but we + // *can't* be sure it is one: removing everything up to the comma would + // be too dangerous. 
+ throw new Error("Invalid base64 input, it looks like a data url."); + } + + input = input.replace(/[^A-Za-z0-9\+\/\=]/g, ""); + + var totalLength = input.length * 3 / 4; + if(input.charAt(input.length - 1) === _keyStr.charAt(64)) { + totalLength--; + } + if(input.charAt(input.length - 2) === _keyStr.charAt(64)) { + totalLength--; + } + if (totalLength % 1 !== 0) { + // totalLength is not an integer, the length does not match a valid + // base64 content. That can happen if: + // - the input is not a base64 content + // - the input is *almost* a base64 content, with a extra chars at the + // beginning or at the end + // - the input uses a base64 variant (base64url for example) + throw new Error("Invalid base64 input, bad content length."); + } + var output; + if (support.uint8array) { + output = new Uint8Array(totalLength|0); + } else { + output = new Array(totalLength|0); + } + + while (i < input.length) { + + enc1 = _keyStr.indexOf(input.charAt(i++)); + enc2 = _keyStr.indexOf(input.charAt(i++)); + enc3 = _keyStr.indexOf(input.charAt(i++)); + enc4 = _keyStr.indexOf(input.charAt(i++)); + + chr1 = (enc1 << 2) | (enc2 >> 4); + chr2 = ((enc2 & 15) << 4) | (enc3 >> 2); + chr3 = ((enc3 & 3) << 6) | enc4; + + output[resultIndex++] = chr1; + + if (enc3 !== 64) { + output[resultIndex++] = chr2; + } + if (enc4 !== 64) { + output[resultIndex++] = chr3; + } + + } + + return output; +}; + +},{"./support":30,"./utils":32}],2:[function(require,module,exports){ +'use strict'; + +var external = require("./external"); +var DataWorker = require('./stream/DataWorker'); +var DataLengthProbe = require('./stream/DataLengthProbe'); +var Crc32Probe = require('./stream/Crc32Probe'); +var DataLengthProbe = require('./stream/DataLengthProbe'); + +/** + * Represent a compressed object, with everything needed to decompress it. + * @constructor + * @param {number} compressedSize the size of the data compressed. + * @param {number} uncompressedSize the size of the data after decompression. 
+ * @param {number} crc32 the crc32 of the decompressed file. + * @param {object} compression the type of compression, see lib/compressions.js. + * @param {String|ArrayBuffer|Uint8Array|Buffer} data the compressed data. + */ +function CompressedObject(compressedSize, uncompressedSize, crc32, compression, data) { + this.compressedSize = compressedSize; + this.uncompressedSize = uncompressedSize; + this.crc32 = crc32; + this.compression = compression; + this.compressedContent = data; +} + +CompressedObject.prototype = { + /** + * Create a worker to get the uncompressed content. + * @return {GenericWorker} the worker. + */ + getContentWorker : function () { + var worker = new DataWorker(external.Promise.resolve(this.compressedContent)) + .pipe(this.compression.uncompressWorker()) + .pipe(new DataLengthProbe("data_length")); + + var that = this; + worker.on("end", function () { + if(this.streamInfo['data_length'] !== that.uncompressedSize) { + throw new Error("Bug : uncompressed data size mismatch"); + } + }); + return worker; + }, + /** + * Create a worker to get the compressed content. + * @return {GenericWorker} the worker. + */ + getCompressedWorker : function () { + return new DataWorker(external.Promise.resolve(this.compressedContent)) + .withStreamInfo("compressedSize", this.compressedSize) + .withStreamInfo("uncompressedSize", this.uncompressedSize) + .withStreamInfo("crc32", this.crc32) + .withStreamInfo("compression", this.compression) + ; + } +}; + +/** + * Chain the given worker with other workers to compress the content with the + * given compresion. + * @param {GenericWorker} uncompressedWorker the worker to pipe. + * @param {Object} compression the compression object. + * @param {Object} compressionOptions the options to use when compressing. + * @return {GenericWorker} the new worker compressing the content. 
+ */ +CompressedObject.createWorkerFrom = function (uncompressedWorker, compression, compressionOptions) { + return uncompressedWorker + .pipe(new Crc32Probe()) + .pipe(new DataLengthProbe("uncompressedSize")) + .pipe(compression.compressWorker(compressionOptions)) + .pipe(new DataLengthProbe("compressedSize")) + .withStreamInfo("compression", compression); +}; + +module.exports = CompressedObject; + +},{"./external":6,"./stream/Crc32Probe":25,"./stream/DataLengthProbe":26,"./stream/DataWorker":27}],3:[function(require,module,exports){ +'use strict'; + +var GenericWorker = require("./stream/GenericWorker"); + +exports.STORE = { + magic: "\x00\x00", + compressWorker : function (compressionOptions) { + return new GenericWorker("STORE compression"); + }, + uncompressWorker : function () { + return new GenericWorker("STORE decompression"); + } +}; +exports.DEFLATE = require('./flate'); + +},{"./flate":7,"./stream/GenericWorker":28}],4:[function(require,module,exports){ +'use strict'; + +var utils = require('./utils'); + +/** + * The following functions come from pako, from pako/lib/zlib/crc32.js + * released under the MIT license, see pako https://github.com/nodeca/pako/ + */ + +// Use ordinary array, since untyped makes no boost here +function makeTable() { + var c, table = []; + + for(var n =0; n < 256; n++){ + c = n; + for(var k =0; k < 8; k++){ + c = ((c&1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1)); + } + table[n] = c; + } + + return table; +} + +// Create table on load. Just 255 signed longs. Not a problem. +var crcTable = makeTable(); + + +function crc32(crc, buf, len, pos) { + var t = crcTable, end = pos + len; + + crc = crc ^ (-1); + + for (var i = pos; i < end; i++ ) { + crc = (crc >>> 8) ^ t[(crc ^ buf[i]) & 0xFF]; + } + + return (crc ^ (-1)); // >>> 0; +} + +// That's all for the pako functions. + +/** + * Compute the crc32 of a string. + * This is almost the same as the function crc32, but for strings. 
Using the + * same function for the two use cases leads to horrible performances. + * @param {Number} crc the starting value of the crc. + * @param {String} str the string to use. + * @param {Number} len the length of the string. + * @param {Number} pos the starting position for the crc32 computation. + * @return {Number} the computed crc32. + */ +function crc32str(crc, str, len, pos) { + var t = crcTable, end = pos + len; + + crc = crc ^ (-1); + + for (var i = pos; i < end; i++ ) { + crc = (crc >>> 8) ^ t[(crc ^ str.charCodeAt(i)) & 0xFF]; + } + + return (crc ^ (-1)); // >>> 0; +} + +module.exports = function crc32wrapper(input, crc) { + if (typeof input === "undefined" || !input.length) { + return 0; + } + + var isArray = utils.getTypeOf(input) !== "string"; + + if(isArray) { + return crc32(crc|0, input, input.length, 0); + } else { + return crc32str(crc|0, input, input.length, 0); + } +}; + +},{"./utils":32}],5:[function(require,module,exports){ +'use strict'; +exports.base64 = false; +exports.binary = false; +exports.dir = false; +exports.createFolders = true; +exports.date = null; +exports.compression = null; +exports.compressionOptions = null; +exports.comment = null; +exports.unixPermissions = null; +exports.dosPermissions = null; + +},{}],6:[function(require,module,exports){ +/* global Promise */ +'use strict'; + +// load the global object first: +// - it should be better integrated in the system (unhandledRejection in node) +// - the environment may have a custom Promise implementation (see zone.js) +var ES6Promise = null; +if (typeof Promise !== "undefined") { + ES6Promise = Promise; +} else { + ES6Promise = require("lie"); +} + +/** + * Let the user use/change some implementations. 
+ */ +module.exports = { + Promise: ES6Promise +}; + +},{"lie":37}],7:[function(require,module,exports){ +'use strict'; +var USE_TYPEDARRAY = (typeof Uint8Array !== 'undefined') && (typeof Uint16Array !== 'undefined') && (typeof Uint32Array !== 'undefined'); + +var pako = require("pako"); +var utils = require("./utils"); +var GenericWorker = require("./stream/GenericWorker"); + +var ARRAY_TYPE = USE_TYPEDARRAY ? "uint8array" : "array"; + +exports.magic = "\x08\x00"; + +/** + * Create a worker that uses pako to inflate/deflate. + * @constructor + * @param {String} action the name of the pako function to call : either "Deflate" or "Inflate". + * @param {Object} options the options to use when (de)compressing. + */ +function FlateWorker(action, options) { + GenericWorker.call(this, "FlateWorker/" + action); + + this._pako = null; + this._pakoAction = action; + this._pakoOptions = options; + // the `meta` object from the last chunk received + // this allow this worker to pass around metadata + this.meta = {}; +} + +utils.inherits(FlateWorker, GenericWorker); + +/** + * @see GenericWorker.processChunk + */ +FlateWorker.prototype.processChunk = function (chunk) { + this.meta = chunk.meta; + if (this._pako === null) { + this._createPako(); + } + this._pako.push(utils.transformTo(ARRAY_TYPE, chunk.data), false); +}; + +/** + * @see GenericWorker.flush + */ +FlateWorker.prototype.flush = function () { + GenericWorker.prototype.flush.call(this); + if (this._pako === null) { + this._createPako(); + } + this._pako.push([], true); +}; +/** + * @see GenericWorker.cleanUp + */ +FlateWorker.prototype.cleanUp = function () { + GenericWorker.prototype.cleanUp.call(this); + this._pako = null; +}; + +/** + * Create the _pako object. + * TODO: lazy-loading this object isn't the best solution but it's the + * quickest. The best solution is to lazy-load the worker list. See also the + * issue #446. 
+ */ +FlateWorker.prototype._createPako = function () { + this._pako = new pako[this._pakoAction]({ + raw: true, + level: this._pakoOptions.level || -1 // default compression + }); + var self = this; + this._pako.onData = function(data) { + self.push({ + data : data, + meta : self.meta + }); + }; +}; + +exports.compressWorker = function (compressionOptions) { + return new FlateWorker("Deflate", compressionOptions); +}; +exports.uncompressWorker = function () { + return new FlateWorker("Inflate", {}); +}; + +},{"./stream/GenericWorker":28,"./utils":32,"pako":38}],8:[function(require,module,exports){ +'use strict'; + +var utils = require('../utils'); +var GenericWorker = require('../stream/GenericWorker'); +var utf8 = require('../utf8'); +var crc32 = require('../crc32'); +var signature = require('../signature'); + +/** + * Transform an integer into a string in hexadecimal. + * @private + * @param {number} dec the number to convert. + * @param {number} bytes the number of bytes to generate. + * @returns {string} the result. + */ +var decToHex = function(dec, bytes) { + var hex = "", i; + for (i = 0; i < bytes; i++) { + hex += String.fromCharCode(dec & 0xff); + dec = dec >>> 8; + } + return hex; +}; + +/** + * Generate the UNIX part of the external file attributes. + * @param {Object} unixPermissions the unix permissions or null. + * @param {Boolean} isDir true if the entry is a directory, false otherwise. + * @return {Number} a 32 bit integer. + * + * adapted from http://unix.stackexchange.com/questions/14705/the-zip-formats-external-file-attribute : + * + * TTTTsstrwxrwxrwx0000000000ADVSHR + * ^^^^____________________________ file type, see zipinfo.c (UNX_*) + * ^^^_________________________ setuid, setgid, sticky + * ^^^^^^^^^________________ permissions + * ^^^^^^^^^^______ not used ? 
+ * ^^^^^^ DOS attribute bits : Archive, Directory, Volume label, System file, Hidden, Read only + */ +var generateUnixExternalFileAttr = function (unixPermissions, isDir) { + + var result = unixPermissions; + if (!unixPermissions) { + // I can't use octal values in strict mode, hence the hexa. + // 040775 => 0x41fd + // 0100664 => 0x81b4 + result = isDir ? 0x41fd : 0x81b4; + } + return (result & 0xFFFF) << 16; +}; + +/** + * Generate the DOS part of the external file attributes. + * @param {Object} dosPermissions the dos permissions or null. + * @param {Boolean} isDir true if the entry is a directory, false otherwise. + * @return {Number} a 32 bit integer. + * + * Bit 0 Read-Only + * Bit 1 Hidden + * Bit 2 System + * Bit 3 Volume Label + * Bit 4 Directory + * Bit 5 Archive + */ +var generateDosExternalFileAttr = function (dosPermissions, isDir) { + + // the dir flag is already set for compatibility + return (dosPermissions || 0) & 0x3F; +}; + +/** + * Generate the various parts used in the construction of the final zip file. + * @param {Object} streamInfo the hash with informations about the compressed file. + * @param {Boolean} streamedContent is the content streamed ? + * @param {Boolean} streamingEnded is the stream finished ? + * @param {number} offset the current offset from the start of the zip file. + * @param {String} platform let's pretend we are this platform (change platform dependents fields) + * @param {Function} encodeFileName the function to encode the file name / comment. + * @return {Object} the zip parts. 
+ */ +var generateZipParts = function(streamInfo, streamedContent, streamingEnded, offset, platform, encodeFileName) { + var file = streamInfo['file'], + compression = streamInfo['compression'], + useCustomEncoding = encodeFileName !== utf8.utf8encode, + encodedFileName = utils.transformTo("string", encodeFileName(file.name)), + utfEncodedFileName = utils.transformTo("string", utf8.utf8encode(file.name)), + comment = file.comment, + encodedComment = utils.transformTo("string", encodeFileName(comment)), + utfEncodedComment = utils.transformTo("string", utf8.utf8encode(comment)), + useUTF8ForFileName = utfEncodedFileName.length !== file.name.length, + useUTF8ForComment = utfEncodedComment.length !== comment.length, + dosTime, + dosDate, + extraFields = "", + unicodePathExtraField = "", + unicodeCommentExtraField = "", + dir = file.dir, + date = file.date; + + + var dataInfo = { + crc32 : 0, + compressedSize : 0, + uncompressedSize : 0 + }; + + // if the content is streamed, the sizes/crc32 are only available AFTER + // the end of the stream. + if (!streamedContent || streamingEnded) { + dataInfo.crc32 = streamInfo['crc32']; + dataInfo.compressedSize = streamInfo['compressedSize']; + dataInfo.uncompressedSize = streamInfo['uncompressedSize']; + } + + var bitflag = 0; + if (streamedContent) { + // Bit 3: the sizes/crc32 are set to zero in the local header. + // The correct values are put in the data descriptor immediately + // following the compressed data. + bitflag |= 0x0008; + } + if (!useCustomEncoding && (useUTF8ForFileName || useUTF8ForComment)) { + // Bit 11: Language encoding flag (EFS). 
+ bitflag |= 0x0800; + } + + + var extFileAttr = 0; + var versionMadeBy = 0; + if (dir) { + // dos or unix, we set the dos dir flag + extFileAttr |= 0x00010; + } + if(platform === "UNIX") { + versionMadeBy = 0x031E; // UNIX, version 3.0 + extFileAttr |= generateUnixExternalFileAttr(file.unixPermissions, dir); + } else { // DOS or other, fallback to DOS + versionMadeBy = 0x0014; // DOS, version 2.0 + extFileAttr |= generateDosExternalFileAttr(file.dosPermissions, dir); + } + + // date + // @see http://www.delorie.com/djgpp/doc/rbinter/it/52/13.html + // @see http://www.delorie.com/djgpp/doc/rbinter/it/65/16.html + // @see http://www.delorie.com/djgpp/doc/rbinter/it/66/16.html + + dosTime = date.getUTCHours(); + dosTime = dosTime << 6; + dosTime = dosTime | date.getUTCMinutes(); + dosTime = dosTime << 5; + dosTime = dosTime | date.getUTCSeconds() / 2; + + dosDate = date.getUTCFullYear() - 1980; + dosDate = dosDate << 4; + dosDate = dosDate | (date.getUTCMonth() + 1); + dosDate = dosDate << 5; + dosDate = dosDate | date.getUTCDate(); + + if (useUTF8ForFileName) { + // set the unicode path extra field. unzip needs at least one extra + // field to correctly handle unicode path, so using the path is as good + // as any other information. This could improve the situation with + // other archive managers too. + // This field is usually used without the utf8 flag, with a non + // unicode path in the header (winrar, winzip). This helps (a bit) + // with the messy Windows' default compressed folders feature but + // breaks on p7zip which doesn't seek the unicode path extra field. + // So for now, UTF-8 everywhere ! 
+ unicodePathExtraField = + // Version + decToHex(1, 1) + + // NameCRC32 + decToHex(crc32(encodedFileName), 4) + + // UnicodeName + utfEncodedFileName; + + extraFields += + // Info-ZIP Unicode Path Extra Field + "\x75\x70" + + // size + decToHex(unicodePathExtraField.length, 2) + + // content + unicodePathExtraField; + } + + if(useUTF8ForComment) { + + unicodeCommentExtraField = + // Version + decToHex(1, 1) + + // CommentCRC32 + decToHex(crc32(encodedComment), 4) + + // UnicodeName + utfEncodedComment; + + extraFields += + // Info-ZIP Unicode Path Extra Field + "\x75\x63" + + // size + decToHex(unicodeCommentExtraField.length, 2) + + // content + unicodeCommentExtraField; + } + + var header = ""; + + // version needed to extract + header += "\x0A\x00"; + // general purpose bit flag + header += decToHex(bitflag, 2); + // compression method + header += compression.magic; + // last mod file time + header += decToHex(dosTime, 2); + // last mod file date + header += decToHex(dosDate, 2); + // crc-32 + header += decToHex(dataInfo.crc32, 4); + // compressed size + header += decToHex(dataInfo.compressedSize, 4); + // uncompressed size + header += decToHex(dataInfo.uncompressedSize, 4); + // file name length + header += decToHex(encodedFileName.length, 2); + // extra field length + header += decToHex(extraFields.length, 2); + + + var fileRecord = signature.LOCAL_FILE_HEADER + header + encodedFileName + extraFields; + + var dirRecord = signature.CENTRAL_FILE_HEADER + + // version made by (00: DOS) + decToHex(versionMadeBy, 2) + + // file header (common to file and central directory) + header + + // file comment length + decToHex(encodedComment.length, 2) + + // disk number start + "\x00\x00" + + // internal file attributes TODO + "\x00\x00" + + // external file attributes + decToHex(extFileAttr, 4) + + // relative offset of local header + decToHex(offset, 4) + + // file name + encodedFileName + + // extra field + extraFields + + // file comment + encodedComment; + + return 
{ + fileRecord: fileRecord, + dirRecord: dirRecord + }; +}; + +/** + * Generate the EOCD record. + * @param {Number} entriesCount the number of entries in the zip file. + * @param {Number} centralDirLength the length (in bytes) of the central dir. + * @param {Number} localDirLength the length (in bytes) of the local dir. + * @param {String} comment the zip file comment as a binary string. + * @param {Function} encodeFileName the function to encode the comment. + * @return {String} the EOCD record. + */ +var generateCentralDirectoryEnd = function (entriesCount, centralDirLength, localDirLength, comment, encodeFileName) { + var dirEnd = ""; + var encodedComment = utils.transformTo("string", encodeFileName(comment)); + + // end of central dir signature + dirEnd = signature.CENTRAL_DIRECTORY_END + + // number of this disk + "\x00\x00" + + // number of the disk with the start of the central directory + "\x00\x00" + + // total number of entries in the central directory on this disk + decToHex(entriesCount, 2) + + // total number of entries in the central directory + decToHex(entriesCount, 2) + + // size of the central directory 4 bytes + decToHex(centralDirLength, 4) + + // offset of start of central directory with respect to the starting disk number + decToHex(localDirLength, 4) + + // .ZIP file comment length + decToHex(encodedComment.length, 2) + + // .ZIP file comment + encodedComment; + + return dirEnd; +}; + +/** + * Generate data descriptors for a file entry. + * @param {Object} streamInfo the hash generated by a worker, containing informations + * on the file entry. + * @return {String} the data descriptors. 
+ */ +var generateDataDescriptors = function (streamInfo) { + var descriptor = ""; + descriptor = signature.DATA_DESCRIPTOR + + // crc-32 4 bytes + decToHex(streamInfo['crc32'], 4) + + // compressed size 4 bytes + decToHex(streamInfo['compressedSize'], 4) + + // uncompressed size 4 bytes + decToHex(streamInfo['uncompressedSize'], 4); + + return descriptor; +}; + + +/** + * A worker to concatenate other workers to create a zip file. + * @param {Boolean} streamFiles `true` to stream the content of the files, + * `false` to accumulate it. + * @param {String} comment the comment to use. + * @param {String} platform the platform to use, "UNIX" or "DOS". + * @param {Function} encodeFileName the function to encode file names and comments. + */ +function ZipFileWorker(streamFiles, comment, platform, encodeFileName) { + GenericWorker.call(this, "ZipFileWorker"); + // The number of bytes written so far. This doesn't count accumulated chunks. + this.bytesWritten = 0; + // The comment of the zip file + this.zipComment = comment; + // The platform "generating" the zip file. + this.zipPlatform = platform; + // the function to encode file names and comments. + this.encodeFileName = encodeFileName; + // Should we stream the content of the files ? + this.streamFiles = streamFiles; + // If `streamFiles` is false, we will need to accumulate the content of the + // files to calculate sizes / crc32 (and write them *before* the content). + // This boolean indicates if we are accumulating chunks (it will change a lot + // during the lifetime of this worker). + this.accumulate = false; + // The buffer receiving chunks when accumulating content. + this.contentBuffer = []; + // The list of generated directory records. + this.dirRecords = []; + // The offset (in bytes) from the beginning of the zip file for the current source. + this.currentSourceOffset = 0; + // The total number of entries in this zip file. 
+ this.entriesCount = 0; + // the name of the file currently being added, null when handling the end of the zip file. + // Used for the emitted metadata. + this.currentFile = null; + + + + this._sources = []; +} +utils.inherits(ZipFileWorker, GenericWorker); + +/** + * @see GenericWorker.push + */ +ZipFileWorker.prototype.push = function (chunk) { + + var currentFilePercent = chunk.meta.percent || 0; + var entriesCount = this.entriesCount; + var remainingFiles = this._sources.length; + + if(this.accumulate) { + this.contentBuffer.push(chunk); + } else { + this.bytesWritten += chunk.data.length; + + GenericWorker.prototype.push.call(this, { + data : chunk.data, + meta : { + currentFile : this.currentFile, + percent : entriesCount ? (currentFilePercent + 100 * (entriesCount - remainingFiles - 1)) / entriesCount : 100 + } + }); + } +}; + +/** + * The worker started a new source (another worker). + * @param {Object} streamInfo the streamInfo object from the new source. + */ +ZipFileWorker.prototype.openedSource = function (streamInfo) { + this.currentSourceOffset = this.bytesWritten; + this.currentFile = streamInfo['file'].name; + + var streamedContent = this.streamFiles && !streamInfo['file'].dir; + + // don't stream folders (because they don't have any content) + if(streamedContent) { + var record = generateZipParts(streamInfo, streamedContent, false, this.currentSourceOffset, this.zipPlatform, this.encodeFileName); + this.push({ + data : record.fileRecord, + meta : {percent:0} + }); + } else { + // we need to wait for the whole file before pushing anything + this.accumulate = true; + } +}; + +/** + * The worker finished a source (another worker). + * @param {Object} streamInfo the streamInfo object from the finished source.
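The `percent` expression in `push` above blends the current file's progress with the number of already-completed entries. Isolated as a function (`overallPercent` is a hypothetical name, mirroring the expression in the source):

```javascript
// Mirror of the meta.percent computation in ZipFileWorker.push:
// each completed entry contributes 100, the current file contributes
// its own percent, and the sum is averaged over all entries.
function overallPercent(currentFilePercent, entriesCount, remainingFiles) {
  return entriesCount
    ? (currentFilePercent + 100 * (entriesCount - remainingFiles - 1)) / entriesCount
    : 100;
}

// 4 entries, 1 still queued, current file half done:
console.log(overallPercent(50, 4, 1)); // 62.5
```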
+ */ +ZipFileWorker.prototype.closedSource = function (streamInfo) { + this.accumulate = false; + var streamedContent = this.streamFiles && !streamInfo['file'].dir; + var record = generateZipParts(streamInfo, streamedContent, true, this.currentSourceOffset, this.zipPlatform, this.encodeFileName); + + this.dirRecords.push(record.dirRecord); + if(streamedContent) { + // after the streamed file, we put data descriptors + this.push({ + data : generateDataDescriptors(streamInfo), + meta : {percent:100} + }); + } else { + // the content wasn't streamed, we need to push everything now + // first the file record, then the content + this.push({ + data : record.fileRecord, + meta : {percent:0} + }); + while(this.contentBuffer.length) { + this.push(this.contentBuffer.shift()); + } + } + this.currentFile = null; +}; + +/** + * @see GenericWorker.flush + */ +ZipFileWorker.prototype.flush = function () { + + var localDirLength = this.bytesWritten; + for(var i = 0; i < this.dirRecords.length; i++) { + this.push({ + data : this.dirRecords[i], + meta : {percent:100} + }); + } + var centralDirLength = this.bytesWritten - localDirLength; + + var dirEnd = generateCentralDirectoryEnd(this.dirRecords.length, centralDirLength, localDirLength, this.zipComment, this.encodeFileName); + + this.push({ + data : dirEnd, + meta : {percent:100} + }); +}; + +/** + * Prepare the next source to be read. 
+ */ +ZipFileWorker.prototype.prepareNextSource = function () { + this.previous = this._sources.shift(); + this.openedSource(this.previous.streamInfo); + if (this.isPaused) { + this.previous.pause(); + } else { + this.previous.resume(); + } +}; + +/** + * @see GenericWorker.registerPrevious + */ +ZipFileWorker.prototype.registerPrevious = function (previous) { + this._sources.push(previous); + var self = this; + + previous.on('data', function (chunk) { + self.processChunk(chunk); + }); + previous.on('end', function () { + self.closedSource(self.previous.streamInfo); + if(self._sources.length) { + self.prepareNextSource(); + } else { + self.end(); + } + }); + previous.on('error', function (e) { + self.error(e); + }); + return this; +}; + +/** + * @see GenericWorker.resume + */ +ZipFileWorker.prototype.resume = function () { + if(!GenericWorker.prototype.resume.call(this)) { + return false; + } + + if (!this.previous && this._sources.length) { + this.prepareNextSource(); + return true; + } + if (!this.previous && !this._sources.length && !this.generatedError) { + this.end(); + return true; + } +}; + +/** + * @see GenericWorker.error + */ +ZipFileWorker.prototype.error = function (e) { + var sources = this._sources; + if(!GenericWorker.prototype.error.call(this, e)) { + return false; + } + for(var i = 0; i < sources.length; i++) { + try { + sources[i].error(e); + } catch(e) { + // the `error` exploded, nothing to do + } + } + return true; +}; + +/** + * @see GenericWorker.lock + */ +ZipFileWorker.prototype.lock = function () { + GenericWorker.prototype.lock.call(this); + var sources = this._sources; + for(var i = 0; i < sources.length; i++) { + sources[i].lock(); + } +}; + +module.exports = ZipFileWorker; + +},{"../crc32":4,"../signature":23,"../stream/GenericWorker":28,"../utf8":31,"../utils":32}],9:[function(require,module,exports){ +'use strict'; + +var compressions = require('../compressions'); +var ZipFileWorker = require('./ZipFileWorker'); + +/** + * Find the 
compression to use. + * @param {String} fileCompression the compression defined at the file level, if any. + * @param {String} zipCompression the compression defined at the load() level. + * @return {Object} the compression object to use. + */ +var getCompression = function (fileCompression, zipCompression) { + + var compressionName = fileCompression || zipCompression; + var compression = compressions[compressionName]; + if (!compression) { + throw new Error(compressionName + " is not a valid compression method!"); + } + return compression; +}; + +/** + * Create a worker to generate a zip file. + * @param {JSZip} zip the JSZip instance at the right root level. + * @param {Object} options to generate the zip file. + * @param {String} comment the comment to use. + */ +exports.generateWorker = function (zip, options, comment) { + + var zipFileWorker = new ZipFileWorker(options.streamFiles, comment, options.platform, options.encodeFileName); + var entriesCount = 0; + try { + + zip.forEach(function (relativePath, file) { + entriesCount++; + var compression = getCompression(file.options.compression, options.compression); + var compressionOptions = file.options.compressionOptions || options.compressionOptions || {}; + var dir = file.dir, date = file.date; + + file._compressWorker(compression, compressionOptions) + .withStreamInfo("file", { + name : relativePath, + dir : dir, + date : date, + comment : file.comment || "", + unixPermissions : file.unixPermissions, + dosPermissions : file.dosPermissions + }) + .pipe(zipFileWorker); + }); + zipFileWorker.entriesCount = entriesCount; + } catch (e) { + zipFileWorker.error(e); + } + + return zipFileWorker; +}; + +},{"../compressions":3,"./ZipFileWorker":8}],10:[function(require,module,exports){ +'use strict'; + +/** + * Representation of a zip file in js + * @constructor + */ +function JSZip() { + // if this constructor is used without `new`, it adds `new` before itself: + if(!(this instanceof JSZip)) { + return new JSZip(); +
} + + if(arguments.length) { + throw new Error("The constructor with parameters has been removed in JSZip 3.0, please check the upgrade guide."); + } + + // object containing the files: + // { + // "folder/" : {...}, + // "folder/data.txt" : {...} + // } + this.files = {}; + + this.comment = null; + + // Where we are in the hierarchy + this.root = ""; + this.clone = function() { + var newObj = new JSZip(); + for (var i in this) { + if (typeof this[i] !== "function") { + newObj[i] = this[i]; + } + } + return newObj; + }; +} +JSZip.prototype = require('./object'); +JSZip.prototype.loadAsync = require('./load'); +JSZip.support = require('./support'); +JSZip.defaults = require('./defaults'); + +// TODO find a better way to handle this version, +// a require('package.json').version doesn't work with webpack, see #327 +JSZip.version = "3.2.0"; + +JSZip.loadAsync = function (content, options) { + return new JSZip().loadAsync(content, options); +}; + +JSZip.external = require("./external"); +module.exports = JSZip; + +},{"./defaults":5,"./external":6,"./load":11,"./object":15,"./support":30}],11:[function(require,module,exports){ +'use strict'; +var utils = require('./utils'); +var external = require("./external"); +var utf8 = require('./utf8'); +var ZipEntries = require('./zipEntries'); +var Crc32Probe = require('./stream/Crc32Probe'); +var nodejsUtils = require("./nodejsUtils"); + +/** + * Check the CRC32 of an entry. + * @param {ZipEntry} zipEntry the zip entry to check. + * @return {Promise} the result.
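The guard at the top of the JSZip constructor above is the classic "new-less constructor" pattern. A minimal standalone sketch (`Thing` is a hypothetical name):

```javascript
// A constructor that supplies `new` when called without it, as JSZip() does.
function Thing() {
  if (!(this instanceof Thing)) {
    return new Thing();
  }
  this.files = {};
  this.root = "";
}

var a = Thing();     // no `new`: the guard inserts it
var b = new Thing(); // normal construction
console.log(a instanceof Thing, b instanceof Thing); // true true
```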
+ */ +function checkEntryCRC32(zipEntry) { + return new external.Promise(function (resolve, reject) { + var worker = zipEntry.decompressed.getContentWorker().pipe(new Crc32Probe()); + worker.on("error", function (e) { + reject(e); + }) + .on("end", function () { + if (worker.streamInfo.crc32 !== zipEntry.decompressed.crc32) { + reject(new Error("Corrupted zip : CRC32 mismatch")); + } else { + resolve(); + } + }) + .resume(); + }); +} + +module.exports = function(data, options) { + var zip = this; + options = utils.extend(options || {}, { + base64: false, + checkCRC32: false, + optimizedBinaryString: false, + createFolders: false, + decodeFileName: utf8.utf8decode + }); + + if (nodejsUtils.isNode && nodejsUtils.isStream(data)) { + return external.Promise.reject(new Error("JSZip can't accept a stream when loading a zip file.")); + } + + return utils.prepareContent("the loaded zip file", data, true, options.optimizedBinaryString, options.base64) + .then(function(data) { + var zipEntries = new ZipEntries(options); + zipEntries.load(data); + return zipEntries; + }).then(function checkCRC32(zipEntries) { + var promises = [external.Promise.resolve(zipEntries)]; + var files = zipEntries.files; + if (options.checkCRC32) { + for (var i = 0; i < files.length; i++) { + promises.push(checkEntryCRC32(files[i])); + } + } + return external.Promise.all(promises); + }).then(function addFiles(results) { + var zipEntries = results.shift(); + var files = zipEntries.files; + for (var i = 0; i < files.length; i++) { + var input = files[i]; + zip.file(input.fileNameStr, input.decompressed, { + binary: true, + optimizedBinaryString: true, + date: input.date, + dir: input.dir, + comment : input.fileCommentStr.length ? 
input.fileCommentStr : null, + unixPermissions : input.unixPermissions, + dosPermissions : input.dosPermissions, + createFolders: options.createFolders + }); + } + if (zipEntries.zipComment.length) { + zip.comment = zipEntries.zipComment; + } + + return zip; + }); +}; + +},{"./external":6,"./nodejsUtils":14,"./stream/Crc32Probe":25,"./utf8":31,"./utils":32,"./zipEntries":33}],12:[function(require,module,exports){ +"use strict"; + +var utils = require('../utils'); +var GenericWorker = require('../stream/GenericWorker'); + +/** + * A worker that uses a nodejs stream as source. + * @constructor + * @param {String} filename the name of the file entry for this stream. + * @param {Readable} stream the nodejs stream. + */ +function NodejsStreamInputAdapter(filename, stream) { + GenericWorker.call(this, "Nodejs stream input adapter for " + filename); + this._upstreamEnded = false; + this._bindStream(stream); +} + +utils.inherits(NodejsStreamInputAdapter, GenericWorker); + +/** + * Prepare the stream and bind the callbacks on it. + * Do this ASAP on node 0.10! A lazy binding doesn't always work. + * @param {Stream} stream the nodejs stream to use.
+ */ +NodejsStreamInputAdapter.prototype._bindStream = function (stream) { + var self = this; + this._stream = stream; + stream.pause(); + stream + .on("data", function (chunk) { + self.push({ + data: chunk, + meta : { + percent : 0 + } + }); + }) + .on("error", function (e) { + if(self.isPaused) { + // record the error on the worker (`this` here is the stream, not the worker) + self.generatedError = e; + } else { + self.error(e); + } + }) + .on("end", function () { + if(self.isPaused) { + self._upstreamEnded = true; + } else { + self.end(); + } + }); +}; +NodejsStreamInputAdapter.prototype.pause = function () { + if(!GenericWorker.prototype.pause.call(this)) { + return false; + } + this._stream.pause(); + return true; +}; +NodejsStreamInputAdapter.prototype.resume = function () { + if(!GenericWorker.prototype.resume.call(this)) { + return false; + } + + if(this._upstreamEnded) { + this.end(); + } else { + this._stream.resume(); + } + + return true; +}; + +module.exports = NodejsStreamInputAdapter; + +},{"../stream/GenericWorker":28,"../utils":32}],13:[function(require,module,exports){ +'use strict'; + +var Readable = require('readable-stream').Readable; + +var utils = require('../utils'); +utils.inherits(NodejsStreamOutputAdapter, Readable); + +/** +* A nodejs stream using a worker as source. +* @see the SourceWrapper in http://nodejs.org/api/stream.html +* @constructor +* @param {StreamHelper} helper the helper wrapping the worker +* @param {Object} options the nodejs stream options +* @param {Function} updateCb the update callback.
+*/ +function NodejsStreamOutputAdapter(helper, options, updateCb) { + Readable.call(this, options); + this._helper = helper; + + var self = this; + helper.on("data", function (data, meta) { + if (!self.push(data)) { + self._helper.pause(); + } + if(updateCb) { + updateCb(meta); + } + }) + .on("error", function(e) { + self.emit('error', e); + }) + .on("end", function () { + self.push(null); + }); +} + + +NodejsStreamOutputAdapter.prototype._read = function() { + this._helper.resume(); +}; + +module.exports = NodejsStreamOutputAdapter; + +},{"../utils":32,"readable-stream":16}],14:[function(require,module,exports){ +'use strict'; + +module.exports = { + /** + * True if this is running in Nodejs, will be undefined in a browser. + * In a browser, browserify won't include this file and the whole module + * will be resolved to an empty object. + */ + isNode : typeof Buffer !== "undefined", + /** + * Create a new nodejs Buffer from existing content. + * @param {Object} data the data to pass to the constructor. + * @param {String} encoding the encoding to use. + * @return {Buffer} a new Buffer. + */ + newBufferFrom: function(data, encoding) { + if (Buffer.from && Buffer.from !== Uint8Array.from) { + return Buffer.from(data, encoding); + } else { + if (typeof data === "number") { + // Safeguard for old Node.js versions. On newer versions, + // Buffer.from(number) / Buffer(number, encoding) already throw. + throw new Error("The \"data\" argument must not be a number"); + } + return new Buffer(data, encoding); + } + }, + /** + * Create a new nodejs Buffer with the specified size. + * @param {Integer} size the size of the buffer. + * @return {Buffer} a new Buffer. + */ + allocBuffer: function (size) { + if (Buffer.alloc) { + return Buffer.alloc(size); + } else { + var buf = new Buffer(size); + buf.fill(0); + return buf; + } + }, + /** + * Find out if an object is a Buffer. + * @param {Object} b the object to test.
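The two Buffer helpers above are feature detections: prefer `Buffer.from`/`Buffer.alloc` where they exist, fall back to the legacy constructor on old Node versions. A usage sketch (Node.js only), re-implementing the same fallbacks:

```javascript
// Same feature detection as nodejsUtils.newBufferFrom / allocBuffer.
function newBufferFrom(data, encoding) {
  if (Buffer.from && Buffer.from !== Uint8Array.from) {
    return Buffer.from(data, encoding);
  }
  return new Buffer(data, encoding); // legacy fallback (pre Buffer.from)
}

function allocBuffer(size) {
  if (Buffer.alloc) {
    return Buffer.alloc(size);
  }
  var buf = new Buffer(size); // legacy fallback: must be zero-filled by hand
  buf.fill(0);
  return buf;
}

console.log(newBufferFrom("abc", "utf-8").length); // 3
console.log(allocBuffer(4)[0]);                    // 0
```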
+ * @return {Boolean} true if the object is a Buffer, false otherwise. + */ + isBuffer : function(b){ + return Buffer.isBuffer(b); + }, + + isStream : function (obj) { + return obj && + typeof obj.on === "function" && + typeof obj.pause === "function" && + typeof obj.resume === "function"; + } +}; + +},{}],15:[function(require,module,exports){ +'use strict'; +var utf8 = require('./utf8'); +var utils = require('./utils'); +var GenericWorker = require('./stream/GenericWorker'); +var StreamHelper = require('./stream/StreamHelper'); +var defaults = require('./defaults'); +var CompressedObject = require('./compressedObject'); +var ZipObject = require('./zipObject'); +var generate = require("./generate"); +var nodejsUtils = require("./nodejsUtils"); +var NodejsStreamInputAdapter = require("./nodejs/NodejsStreamInputAdapter"); + + +/** + * Add a file in the current folder. + * @private + * @param {string} name the name of the file + * @param {String|ArrayBuffer|Uint8Array|Buffer} data the data of the file + * @param {Object} originalOptions the options of the file + * @return {Object} the new file. + */ +var fileAdd = function(name, data, originalOptions) { + // be sure sub folders exist + var dataType = utils.getTypeOf(data), + parent; + + + /* + * Correct options. 
+ */ + + var o = utils.extend(originalOptions || {}, defaults); + o.date = o.date || new Date(); + if (o.compression !== null) { + o.compression = o.compression.toUpperCase(); + } + + if (typeof o.unixPermissions === "string") { + o.unixPermissions = parseInt(o.unixPermissions, 8); + } + + // UNX_IFDIR 0040000 see zipinfo.c + if (o.unixPermissions && (o.unixPermissions & 0x4000)) { + o.dir = true; + } + // Bit 4 Directory + if (o.dosPermissions && (o.dosPermissions & 0x0010)) { + o.dir = true; + } + + if (o.dir) { + name = forceTrailingSlash(name); + } + if (o.createFolders && (parent = parentFolder(name))) { + folderAdd.call(this, parent, true); + } + + var isUnicodeString = dataType === "string" && o.binary === false && o.base64 === false; + if (!originalOptions || typeof originalOptions.binary === "undefined") { + o.binary = !isUnicodeString; + } + + + var isCompressedEmpty = (data instanceof CompressedObject) && data.uncompressedSize === 0; + + if (isCompressedEmpty || o.dir || !data || data.length === 0) { + o.base64 = false; + o.binary = true; + data = ""; + o.compression = "STORE"; + dataType = "string"; + } + + /* + * Convert content to fit. + */ + + var zipObjectContent = null; + if (data instanceof CompressedObject || data instanceof GenericWorker) { + zipObjectContent = data; + } else if (nodejsUtils.isNode && nodejsUtils.isStream(data)) { + zipObjectContent = new NodejsStreamInputAdapter(name, data); + } else { + zipObjectContent = utils.prepareContent(name, data, o.binary, o.optimizedBinaryString, o.base64); + } + + var object = new ZipObject(name, zipObjectContent, o); + this.files[name] = object; + /* + TODO: we can't throw an exception because we have async promises + (we can have a promise of a Date() for example) but returning a + promise is useless because file(name, data) returns the JSZip + object for chaining. Should we break that to allow the user + to catch the error ? 
+ + return external.Promise.resolve(zipObjectContent) + .then(function () { + return object; + }); + */ +}; + +/** + * Find the parent folder of the path. + * @private + * @param {string} path the path to use + * @return {string} the parent folder, or "" + */ +var parentFolder = function (path) { + if (path.slice(-1) === '/') { + path = path.substring(0, path.length - 1); + } + var lastSlash = path.lastIndexOf('/'); + return (lastSlash > 0) ? path.substring(0, lastSlash) : ""; +}; + +/** + * Returns the path with a slash at the end. + * @private + * @param {String} path the path to check. + * @return {String} the path with a trailing slash. + */ +var forceTrailingSlash = function(path) { + // Check the name ends with a / + if (path.slice(-1) !== "/") { + path += "/"; // IE doesn't like substr(-1) + } + return path; +}; + +/** + * Add a (sub) folder in the current folder. + * @private + * @param {string} name the folder's name + * @param {boolean=} [createFolders] If true, automatically create sub + * folders. Defaults to false. + * @return {Object} the new folder. + */ +var folderAdd = function(name, createFolders) { + createFolders = (typeof createFolders !== 'undefined') ? createFolders : defaults.createFolders; + + name = forceTrailingSlash(name); + + // Does this folder already exist? 
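The permission handling in `fileAdd` above (octal-string parsing and the UNX_IFDIR bit test) can be sketched in isolation; `normalizePermissions` is a hypothetical helper name mirroring the source's logic:

```javascript
// Octal strings are parsed base 8, and the UNX_IFDIR bit
// (0x4000, i.e. octal 040000, see zipinfo.c) marks a directory.
function normalizePermissions(unixPermissions) {
  if (typeof unixPermissions === "string") {
    unixPermissions = parseInt(unixPermissions, 8);
  }
  return { mode: unixPermissions, dir: !!(unixPermissions & 0x4000) };
}

console.log(normalizePermissions("40755").dir); // true (directory bit set)
console.log(normalizePermissions("644").dir);   // false (plain file)
```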
+ if (!this.files[name]) { + fileAdd.call(this, name, null, { + dir: true, + createFolders: createFolders + }); + } + return this.files[name]; +}; + +/** +* Cross-window, cross-Node-context regular expression detection +* @param {Object} object Anything +* @return {Boolean} true if the object is a regular expression, +* false otherwise +*/ +function isRegExp(object) { + return Object.prototype.toString.call(object) === "[object RegExp]"; +} + +// return the actual prototype of JSZip +var out = { + /** + * @see loadAsync + */ + load: function() { + throw new Error("This method has been removed in JSZip 3.0, please check the upgrade guide."); + }, + + + /** + * Call a callback function for each entry at this folder level. + * @param {Function} cb the callback function: + * function (relativePath, file) {...} + * It takes 2 arguments : the relative path and the file. + */ + forEach: function(cb) { + var filename, relativePath, file; + for (filename in this.files) { + if (!this.files.hasOwnProperty(filename)) { + continue; + } + file = this.files[filename]; + relativePath = filename.slice(this.root.length, filename.length); + if (relativePath && filename.slice(0, this.root.length) === this.root) { // the file is in the current root + cb(relativePath, file); // TODO reverse the parameters ? need to be clean AND consistent with the filter search fn... + } + } + }, + + /** + * Filter nested files/folders with the specified function. + * @param {Function} search the predicate to use : + * function (relativePath, file) {...} + * It takes 2 arguments : the relative path and the file. + * @return {Array} An array of matching elements. + */ + filter: function(search) { + var result = []; + this.forEach(function (relativePath, entry) { + if (search(relativePath, entry)) { // the file matches the function + result.push(entry); + } + + }); + return result; + }, + + /** + * Add a file to the zip file, or search a file. 
+ * @param {string|RegExp} name The name of the file to add (if data is defined), + * the name of the file to find (if no data) or a regex to match files. + * @param {String|ArrayBuffer|Uint8Array|Buffer} data The file data, either raw or base64 encoded + * @param {Object} o File options + * @return {JSZip|Object|Array} this JSZip object (when adding a file), + * a file (when searching by string) or an array of files (when searching by regex). + */ + file: function(name, data, o) { + if (arguments.length === 1) { + if (isRegExp(name)) { + var regexp = name; + return this.filter(function(relativePath, file) { + return !file.dir && regexp.test(relativePath); + }); + } + else { // text + var obj = this.files[this.root + name]; + if (obj && !obj.dir) { + return obj; + } else { + return null; + } + } + } + else { // more than one argument : we have data ! + name = this.root + name; + fileAdd.call(this, name, data, o); + } + return this; + }, + + /** + * Add a directory to the zip file, or search. + * @param {String|RegExp} arg The name of the directory to add, or a regex to search folders. + * @return {JSZip} an object with the new directory as the root, or an array containing matching folders. 
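The regex branch of `file()` above delegates to `filter()`: keep non-directory entries whose relative path matches. The matching logic, sketched over a plain object shaped like `this.files`:

```javascript
// Stand-in for this.files: names map to entries with a `dir` flag.
var files = {
  "a.txt":     { dir: false },
  "img/":      { dir: true },
  "img/b.png": { dir: false }
};

// file(/regex/) keeps non-directory entries whose path matches.
var regexp = /\.png$/;
var matches = Object.keys(files).filter(function (name) {
  return !files[name].dir && regexp.test(name);
});
console.log(matches); // [ 'img/b.png' ]
```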
+ */ + folder: function(arg) { + if (!arg) { + return this; + } + + if (isRegExp(arg)) { + return this.filter(function(relativePath, file) { + return file.dir && arg.test(relativePath); + }); + } + + // else, name is a new folder + var name = this.root + arg; + var newFolder = folderAdd.call(this, name); + + // Allow chaining by returning a new object with this folder as the root + var ret = this.clone(); + ret.root = newFolder.name; + return ret; + }, + + /** + * Delete a file, or a directory and all sub-files, from the zip + * @param {string} name the name of the file to delete + * @return {JSZip} this JSZip object + */ + remove: function(name) { + name = this.root + name; + var file = this.files[name]; + if (!file) { + // Look for any folders + if (name.slice(-1) !== "/") { + name += "/"; + } + file = this.files[name]; + } + + if (file && !file.dir) { + // file + delete this.files[name]; + } else { + // maybe a folder, delete recursively + var kids = this.filter(function(relativePath, file) { + return file.name.slice(0, name.length) === name; + }); + for (var i = 0; i < kids.length; i++) { + delete this.files[kids[i].name]; + } + } + + return this; + }, + + /** + * Generate the complete zip file + * @param {Object} options the options to generate the zip file : + * - compression, "STORE" by default. + * - type, "base64" by default. Values are : string, base64, uint8array, arraybuffer, blob. + * @return {String|Uint8Array|ArrayBuffer|Buffer|Blob} the zip file + */ + generate: function(options) { + throw new Error("This method has been removed in JSZip 3.0, please check the upgrade guide."); + }, + + /** + * Generate the complete zip file as an internal stream. + * @param {Object} options the options to generate the zip file : + * - compression, "STORE" by default. + * - type, "base64" by default. Values are : string, base64, uint8array, arraybuffer, blob. + * @return {StreamHelper} the streamed zip file. 
+ */ + generateInternalStream: function(options) { + var worker, opts = {}; + try { + opts = utils.extend(options || {}, { + streamFiles: false, + compression: "STORE", + compressionOptions : null, + type: "", + platform: "DOS", + comment: null, + mimeType: 'application/zip', + encodeFileName: utf8.utf8encode + }); + + opts.type = opts.type.toLowerCase(); + opts.compression = opts.compression.toUpperCase(); + + // "binarystring" is preferred but the internals use "string". + if(opts.type === "binarystring") { + opts.type = "string"; + } + + if (!opts.type) { + throw new Error("No output type specified."); + } + + utils.checkSupport(opts.type); + + // accept nodejs `process.platform` + if( + opts.platform === 'darwin' || + opts.platform === 'freebsd' || + opts.platform === 'linux' || + opts.platform === 'sunos' + ) { + opts.platform = "UNIX"; + } + if (opts.platform === 'win32') { + opts.platform = "DOS"; + } + + var comment = opts.comment || this.comment || ""; + worker = generate.generateWorker(this, opts, comment); + } catch (e) { + worker = new GenericWorker("error"); + worker.error(e); + } + return new StreamHelper(worker, opts.type || "string", opts.mimeType); + }, + /** + * Generate the complete zip file asynchronously. + * @see generateInternalStream + */ + generateAsync: function(options, onUpdate) { + return this.generateInternalStream(options).accumulate(onUpdate); + }, + /** + * Generate the complete zip file as a nodejs stream.
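The platform handling in `generateInternalStream` above accepts Node's `process.platform` values and maps them onto the two host systems JSZip emits. As a standalone function (hypothetical name):

```javascript
// Mirror of the platform normalization: Unix-like process.platform
// values map to "UNIX", win32 to "DOS"; "UNIX"/"DOS" pass through.
function normalizePlatform(platform) {
  if (platform === "darwin" || platform === "freebsd" ||
      platform === "linux" || platform === "sunos") {
    return "UNIX";
  }
  if (platform === "win32") {
    return "DOS";
  }
  return platform;
}

console.log(normalizePlatform("linux")); // UNIX
console.log(normalizePlatform("win32")); // DOS
```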
+ * @see generateInternalStream + */ + generateNodeStream: function(options, onUpdate) { + options = options || {}; + if (!options.type) { + options.type = "nodebuffer"; + } + return this.generateInternalStream(options).toNodejsStream(onUpdate); + } +}; +module.exports = out; + +},{"./compressedObject":2,"./defaults":5,"./generate":9,"./nodejs/NodejsStreamInputAdapter":12,"./nodejsUtils":14,"./stream/GenericWorker":28,"./stream/StreamHelper":29,"./utf8":31,"./utils":32,"./zipObject":35}],16:[function(require,module,exports){ +/* + * This file is used by module bundlers (browserify/webpack/etc) when + * including a stream implementation. We use "readable-stream" to get a + * consistent behavior between nodejs versions but bundlers often have a shim + * for "stream". Using this shim greatly improves the compatibility and greatly + * reduces the final size of the bundle (only one stream implementation, not + * two). + */ +module.exports = require("stream"); + +},{"stream":undefined}],17:[function(require,module,exports){ +'use strict'; +var DataReader = require('./DataReader'); +var utils = require('../utils'); + +function ArrayReader(data) { + DataReader.call(this, data); + for(var i = 0; i < this.data.length; i++) { + data[i] = data[i] & 0xFF; + } +} +utils.inherits(ArrayReader, DataReader); +/** + * @see DataReader.byteAt + */ +ArrayReader.prototype.byteAt = function(i) { + return this.data[this.zero + i]; +}; +/** + * @see DataReader.lastIndexOfSignature + */ +ArrayReader.prototype.lastIndexOfSignature = function(sig) { + var sig0 = sig.charCodeAt(0), + sig1 = sig.charCodeAt(1), + sig2 = sig.charCodeAt(2), + sig3 = sig.charCodeAt(3); + for (var i = this.length - 4; i >= 0; --i) { + if (this.data[i] === sig0 && this.data[i + 1] === sig1 && this.data[i + 2] === sig2 && this.data[i + 3] === sig3) { + return i - this.zero; + } + } + + return -1; +}; +/** + * @see DataReader.readAndCheckSignature + */ +ArrayReader.prototype.readAndCheckSignature = function (sig) { + var
sig0 = sig.charCodeAt(0), + sig1 = sig.charCodeAt(1), + sig2 = sig.charCodeAt(2), + sig3 = sig.charCodeAt(3), + data = this.readData(4); + return sig0 === data[0] && sig1 === data[1] && sig2 === data[2] && sig3 === data[3]; +}; +/** + * @see DataReader.readData + */ +ArrayReader.prototype.readData = function(size) { + this.checkOffset(size); + if(size === 0) { + return []; + } + var result = this.data.slice(this.zero + this.index, this.zero + this.index + size); + this.index += size; + return result; +}; +module.exports = ArrayReader; + +},{"../utils":32,"./DataReader":18}],18:[function(require,module,exports){ +'use strict'; +var utils = require('../utils'); + +function DataReader(data) { + this.data = data; // type : see implementation + this.length = data.length; + this.index = 0; + this.zero = 0; +} +DataReader.prototype = { + /** + * Check that the offset will not go too far. + * @param {number} offset the additional offset to check. + * @throws {Error} an Error if the offset is out of bounds. + */ + checkOffset: function(offset) { + this.checkIndex(this.index + offset); + }, + /** + * Check that the specified index will not be too far. + * @param {number} newIndex the index to check. + * @throws {Error} an Error if the index is out of bounds. + */ + checkIndex: function(newIndex) { + if (this.length < this.zero + newIndex || newIndex < 0) { + throw new Error("End of data reached (data length = " + this.length + ", asked index = " + (newIndex) + "). Corrupted zip ?"); + } + }, + /** + * Change the index. + * @param {number} newIndex The new index. + * @throws {Error} if the new index is out of the data. + */ + setIndex: function(newIndex) { + this.checkIndex(newIndex); + this.index = newIndex; + }, + /** + * Skip the next n bytes. + * @param {number} n the number of bytes to skip. + * @throws {Error} if the new index is out of the data. + */ + skip: function(n) { + this.setIndex(this.index + n); + }, + /** + * Get the byte at the specified index.
+ * @param {number} i the index to use. + * @return {number} a byte. + */ + byteAt: function(i) { + // see implementations + }, + /** + * Get the next number with a given byte size. + * @param {number} size the number of bytes to read. + * @return {number} the corresponding number. + */ + readInt: function(size) { + var result = 0, + i; + this.checkOffset(size); + for (i = this.index + size - 1; i >= this.index; i--) { + result = (result << 8) + this.byteAt(i); + } + this.index += size; + return result; + }, + /** + * Get the next string with a given byte size. + * @param {number} size the number of bytes to read. + * @return {string} the corresponding string. + */ + readString: function(size) { + return utils.transformTo("string", this.readData(size)); + }, + /** + * Get raw data without conversion, bytes. + * @param {number} size the number of bytes to read. + * @return {Object} the raw data, implementation specific. + */ + readData: function(size) { + // see implementations + }, + /** + * Find the last occurrence of a zip signature (4 bytes). + * @param {string} sig the signature to find. + * @return {number} the index of the last occurrence, -1 if not found. + */ + lastIndexOfSignature: function(sig) { + // see implementations + }, + /** + * Read the signature (4 bytes) at the current position and compare it with sig. + * @param {string} sig the expected signature + * @return {boolean} true if the signature matches, false otherwise. + */ + readAndCheckSignature: function(sig) { + // see implementations + }, + /** + * Get the next date.
+ */ + readDate: function() { + var dostime = this.readInt(4); + return new Date(Date.UTC( + ((dostime >> 25) & 0x7f) + 1980, // year + ((dostime >> 21) & 0x0f) - 1, // month + (dostime >> 16) & 0x1f, // day + (dostime >> 11) & 0x1f, // hour + (dostime >> 5) & 0x3f, // minute + (dostime & 0x1f) << 1)); // second + } +}; +module.exports = DataReader; + +},{"../utils":32}],19:[function(require,module,exports){ +'use strict'; +var Uint8ArrayReader = require('./Uint8ArrayReader'); +var utils = require('../utils'); + +function NodeBufferReader(data) { + Uint8ArrayReader.call(this, data); +} +utils.inherits(NodeBufferReader, Uint8ArrayReader); + +/** + * @see DataReader.readData + */ +NodeBufferReader.prototype.readData = function(size) { + this.checkOffset(size); + var result = this.data.slice(this.zero + this.index, this.zero + this.index + size); + this.index += size; + return result; +}; +module.exports = NodeBufferReader; + +},{"../utils":32,"./Uint8ArrayReader":21}],20:[function(require,module,exports){ +'use strict'; +var DataReader = require('./DataReader'); +var utils = require('../utils'); + +function StringReader(data) { + DataReader.call(this, data); +} +utils.inherits(StringReader, DataReader); +/** + * @see DataReader.byteAt + */ +StringReader.prototype.byteAt = function(i) { + return this.data.charCodeAt(this.zero + i); +}; +/** + * @see DataReader.lastIndexOfSignature + */ +StringReader.prototype.lastIndexOfSignature = function(sig) { + return this.data.lastIndexOf(sig) - this.zero; +}; +/** + * @see DataReader.readAndCheckSignature + */ +StringReader.prototype.readAndCheckSignature = function (sig) { + var data = this.readData(4); + return sig === data; +}; +/** + * @see DataReader.readData + */ +StringReader.prototype.readData = function(size) { + this.checkOffset(size); + // this will work because the constructor applied the "& 0xff" mask. 
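The bit layout unpacked by `readDate` above follows the DOS timestamp format: one 32-bit value with year/month/day in the high 16 bits and hour/minute/second in the low 16 (seconds stored in 2-second units). A round-trip sketch; `dosDateToParts` is a hypothetical helper mirroring the shifts in the source:

```javascript
// Mirror of readDate's shifts, returning the parts instead of a Date.
function dosDateToParts(dostime) {
  return {
    year:   ((dostime >> 25) & 0x7f) + 1980,
    month:  ((dostime >> 21) & 0x0f) - 1, // 0-based, like Date
    day:    (dostime >> 16) & 0x1f,
    hour:   (dostime >> 11) & 0x1f,
    minute: (dostime >> 5) & 0x3f,
    second: (dostime & 0x1f) << 1          // stored in 2-second units
  };
}

// 2019-06-15 12:30:10 packed as a DOS timestamp:
var packed = ((2019 - 1980) << 25) | (6 << 21) | (15 << 16) |
             (12 << 11) | (30 << 5) | (10 >> 1);
var p = dosDateToParts(packed);
console.log(p.year, p.month + 1, p.day);  // 2019 6 15
console.log(p.hour, p.minute, p.second);  // 12 30 10
```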
+ var result = this.data.slice(this.zero + this.index, this.zero + this.index + size); + this.index += size; + return result; +}; +module.exports = StringReader; + +},{"../utils":32,"./DataReader":18}],21:[function(require,module,exports){ +'use strict'; +var ArrayReader = require('./ArrayReader'); +var utils = require('../utils'); + +function Uint8ArrayReader(data) { + ArrayReader.call(this, data); +} +utils.inherits(Uint8ArrayReader, ArrayReader); +/** + * @see DataReader.readData + */ +Uint8ArrayReader.prototype.readData = function(size) { + this.checkOffset(size); + if(size === 0) { + // in IE10, when using subarray(idx, idx), we get the array [0x00] instead of []. + return new Uint8Array(0); + } + var result = this.data.subarray(this.zero + this.index, this.zero + this.index + size); + this.index += size; + return result; +}; +module.exports = Uint8ArrayReader; + +},{"../utils":32,"./ArrayReader":17}],22:[function(require,module,exports){ +'use strict'; + +var utils = require('../utils'); +var support = require('../support'); +var ArrayReader = require('./ArrayReader'); +var StringReader = require('./StringReader'); +var NodeBufferReader = require('./NodeBufferReader'); +var Uint8ArrayReader = require('./Uint8ArrayReader'); + +/** + * Create a reader adapted to the data. + * @param {String|ArrayBuffer|Uint8Array|Buffer} data the data to read. + * @return {DataReader} the data reader. 
+ */
+module.exports = function (data) {
+ var type = utils.getTypeOf(data);
+ utils.checkSupport(type);
+ if (type === "string" && !support.uint8array) {
+ return new StringReader(data);
+ }
+ if (type === "nodebuffer") {
+ return new NodeBufferReader(data);
+ }
+ if (support.uint8array) {
+ return new Uint8ArrayReader(utils.transformTo("uint8array", data));
+ }
+ return new ArrayReader(utils.transformTo("array", data));
+};
+
+},{"../support":30,"../utils":32,"./ArrayReader":17,"./NodeBufferReader":19,"./StringReader":20,"./Uint8ArrayReader":21}],23:[function(require,module,exports){
+'use strict';
+exports.LOCAL_FILE_HEADER = "PK\x03\x04";
+exports.CENTRAL_FILE_HEADER = "PK\x01\x02";
+exports.CENTRAL_DIRECTORY_END = "PK\x05\x06";
+exports.ZIP64_CENTRAL_DIRECTORY_LOCATOR = "PK\x06\x07";
+exports.ZIP64_CENTRAL_DIRECTORY_END = "PK\x06\x06";
+exports.DATA_DESCRIPTOR = "PK\x07\x08";
+
+},{}],24:[function(require,module,exports){
+'use strict';
+
+var GenericWorker = require('./GenericWorker');
+var utils = require('../utils');
+
+/**
+ * A worker which converts chunks to a specified type.
+ * @constructor
+ * @param {String} destType the destination type.
+ */
+function ConvertWorker(destType) {
+ GenericWorker.call(this, "ConvertWorker to " + destType);
+ this.destType = destType;
+}
+utils.inherits(ConvertWorker, GenericWorker);
+
+/**
+ * @see GenericWorker.processChunk
+ */
+ConvertWorker.prototype.processChunk = function (chunk) {
+ this.push({
+ data : utils.transformTo(this.destType, chunk.data),
+ meta : chunk.meta
+ });
+};
+module.exports = ConvertWorker;
+
+},{"../utils":32,"./GenericWorker":28}],25:[function(require,module,exports){
+'use strict';
+
+var GenericWorker = require('./GenericWorker');
+var crc32 = require('../crc32');
+var utils = require('../utils');
+
+/**
+ * A worker which calculates the crc32 of the data flowing through.
+ * @constructor
+ */
+function Crc32Probe() {
+ GenericWorker.call(this, "Crc32Probe");
+ this.withStreamInfo("crc32", 0);
+}
+utils.inherits(Crc32Probe, GenericWorker);
+
+/**
+ * @see GenericWorker.processChunk
+ */
+Crc32Probe.prototype.processChunk = function (chunk) {
+ this.streamInfo.crc32 = crc32(chunk.data, this.streamInfo.crc32 || 0);
+ this.push(chunk);
+};
+module.exports = Crc32Probe;
+
+},{"../crc32":4,"../utils":32,"./GenericWorker":28}],26:[function(require,module,exports){
+'use strict';
+
+var utils = require('../utils');
+var GenericWorker = require('./GenericWorker');
+
+/**
+ * A worker which calculates the total length of the data flowing through.
+ * @constructor
+ * @param {String} propName the name used to expose the length
+ */
+function DataLengthProbe(propName) {
+ GenericWorker.call(this, "DataLengthProbe for " + propName);
+ this.propName = propName;
+ this.withStreamInfo(propName, 0);
+}
+utils.inherits(DataLengthProbe, GenericWorker);
+
+/**
+ * @see GenericWorker.processChunk
+ */
+DataLengthProbe.prototype.processChunk = function (chunk) {
+ if(chunk) {
+ var length = this.streamInfo[this.propName] || 0;
+ this.streamInfo[this.propName] = length + chunk.data.length;
+ }
+ GenericWorker.prototype.processChunk.call(this, chunk);
+};
+module.exports = DataLengthProbe;
+
+
+},{"../utils":32,"./GenericWorker":28}],27:[function(require,module,exports){
+'use strict';
+
+var utils = require('../utils');
+var GenericWorker = require('./GenericWorker');
+
+// the size of the generated chunks
+// TODO expose this as a public variable
+var DEFAULT_BLOCK_SIZE = 16 * 1024;
+
+/**
+ * A worker that reads content and emits chunks.
+ * @constructor
+ * @param {Promise} dataP the promise of the data to split
+ */
+function DataWorker(dataP) {
+ GenericWorker.call(this, "DataWorker");
+ var self = this;
+ this.dataIsReady = false;
+ this.index = 0;
+ this.max = 0;
+ this.data = null;
+ this.type = "";
+
+ this._tickScheduled = false;
+
+ dataP.then(function (data) {
+ self.dataIsReady = true;
+ self.data = data;
+ self.max = data && data.length || 0;
+ self.type = utils.getTypeOf(data);
+ if(!self.isPaused) {
+ self._tickAndRepeat();
+ }
+ }, function (e) {
+ self.error(e);
+ });
+}
+
+utils.inherits(DataWorker, GenericWorker);
+
+/**
+ * @see GenericWorker.cleanUp
+ */
+DataWorker.prototype.cleanUp = function () {
+ GenericWorker.prototype.cleanUp.call(this);
+ this.data = null;
+};
+
+/**
+ * @see GenericWorker.resume
+ */
+DataWorker.prototype.resume = function () {
+ if(!GenericWorker.prototype.resume.call(this)) {
+ return false;
+ }
+
+ if (!this._tickScheduled && this.dataIsReady) {
+ this._tickScheduled = true;
+ utils.delay(this._tickAndRepeat, [], this);
+ }
+ return true;
+};
+
+/**
+ * Trigger a tick and schedule another call to this function.
+ */
+DataWorker.prototype._tickAndRepeat = function() {
+ this._tickScheduled = false;
+ if(this.isPaused || this.isFinished) {
+ return;
+ }
+ this._tick();
+ if(!this.isFinished) {
+ utils.delay(this._tickAndRepeat, [], this);
+ this._tickScheduled = true;
+ }
+};
+
+/**
+ * Read and push a chunk.
+ */
+DataWorker.prototype._tick = function() {
+
+ if(this.isPaused || this.isFinished) {
+ return false;
+ }
+
+ var size = DEFAULT_BLOCK_SIZE;
+ var data = null, nextIndex = Math.min(this.max, this.index + size);
+ if (this.index >= this.max) {
+ // EOF
+ return this.end();
+ } else {
+ switch(this.type) {
+ case "string":
+ data = this.data.substring(this.index, nextIndex);
+ break;
+ case "uint8array":
+ data = this.data.subarray(this.index, nextIndex);
+ break;
+ case "array":
+ case "nodebuffer":
+ data = this.data.slice(this.index, nextIndex);
+ break;
+ }
+ this.index = nextIndex;
+ return this.push({
+ data : data,
+ meta : {
+ percent : this.max ? this.index / this.max * 100 : 0
+ }
+ });
+ }
+};
+
+module.exports = DataWorker;
+
+},{"../utils":32,"./GenericWorker":28}],28:[function(require,module,exports){
+'use strict';
+
+/**
+ * A worker that does nothing but pass chunks to the next one. This is like
+ * a nodejs stream but with some differences. On the good side :
+ * - it works on IE 6-9 without any issue / polyfill
+ * - it weighs less than the full dependencies bundled with browserify
+ * - it forwards errors (no need to declare an error handler EVERYWHERE)
+ *
+ * A chunk is an object with 2 attributes : `meta` and `data`. The former is an
+ * object containing anything (`percent` for example), see each worker for more
+ * details. The latter is the real data (String, Uint8Array, etc).
+ * + * @constructor + * @param {String} name the name of the stream (mainly used for debugging purposes) + */ +function GenericWorker(name) { + // the name of the worker + this.name = name || "default"; + // an object containing metadata about the workers chain + this.streamInfo = {}; + // an error which happened when the worker was paused + this.generatedError = null; + // an object containing metadata to be merged by this worker into the general metadata + this.extraStreamInfo = {}; + // true if the stream is paused (and should not do anything), false otherwise + this.isPaused = true; + // true if the stream is finished (and should not do anything), false otherwise + this.isFinished = false; + // true if the stream is locked to prevent further structure updates (pipe), false otherwise + this.isLocked = false; + // the event listeners + this._listeners = { + 'data':[], + 'end':[], + 'error':[] + }; + // the previous worker, if any + this.previous = null; +} + +GenericWorker.prototype = { + /** + * Push a chunk to the next workers. + * @param {Object} chunk the chunk to push + */ + push : function (chunk) { + this.emit("data", chunk); + }, + /** + * End the stream. + * @return {Boolean} true if this call ended the worker, false otherwise. + */ + end : function () { + if (this.isFinished) { + return false; + } + + this.flush(); + try { + this.emit("end"); + this.cleanUp(); + this.isFinished = true; + } catch (e) { + this.emit("error", e); + } + return true; + }, + /** + * End the stream with an error. + * @param {Error} e the error which caused the premature end. + * @return {Boolean} true if this call ended the worker with an error, false otherwise. 
+ */
+ error : function (e) {
+ if (this.isFinished) {
+ return false;
+ }
+
+ if(this.isPaused) {
+ this.generatedError = e;
+ } else {
+ this.isFinished = true;
+
+ this.emit("error", e);
+
+ // if the workers chain exploded in the middle of the chain,
+ // the error event will go downward but we also need to notify
+ // workers upward that there has been an error.
+ if(this.previous) {
+ this.previous.error(e);
+ }
+
+ this.cleanUp();
+ }
+ return true;
+ },
+ /**
+ * Add a callback on an event.
+ * @param {String} name the name of the event (data, end, error)
+ * @param {Function} listener the function to call when the event is triggered
+ * @return {GenericWorker} the current object for chainability
+ */
+ on : function (name, listener) {
+ this._listeners[name].push(listener);
+ return this;
+ },
+ /**
+ * Clean any references when a worker is ending.
+ */
+ cleanUp : function () {
+ this.streamInfo = this.generatedError = this.extraStreamInfo = null;
+ this._listeners = [];
+ },
+ /**
+ * Trigger an event. This will call the registered callbacks with the provided arg.
+ * @param {String} name the name of the event (data, end, error)
+ * @param {Object} arg the argument to call the callback with.
+ */
+ emit : function (name, arg) {
+ if (this._listeners[name]) {
+ for(var i = 0; i < this._listeners[name].length; i++) {
+ this._listeners[name][i].call(this, arg);
+ }
+ }
+ },
+ /**
+ * Chain a worker with another.
+ * @param {Worker} next the worker receiving events from the current one.
+ * @return {Worker} the next worker for chainability
+ */
+ pipe : function (next) {
+ return next.registerPrevious(this);
+ },
+ /**
+ * Same as `pipe` in the other direction.
+ * Using an API with `pipe(next)` is very easy.
+ * Implementing the API with the point of view of the next one registering
+ * a source is easier, see the ZipFileWorker.
+ * @param {Worker} previous the previous worker, sending events to this one + * @return {Worker} the current worker for chainability + */ + registerPrevious : function (previous) { + if (this.isLocked) { + throw new Error("The stream '" + this + "' has already been used."); + } + + // sharing the streamInfo... + this.streamInfo = previous.streamInfo; + // ... and adding our own bits + this.mergeStreamInfo(); + this.previous = previous; + var self = this; + previous.on('data', function (chunk) { + self.processChunk(chunk); + }); + previous.on('end', function () { + self.end(); + }); + previous.on('error', function (e) { + self.error(e); + }); + return this; + }, + /** + * Pause the stream so it doesn't send events anymore. + * @return {Boolean} true if this call paused the worker, false otherwise. + */ + pause : function () { + if(this.isPaused || this.isFinished) { + return false; + } + this.isPaused = true; + + if(this.previous) { + this.previous.pause(); + } + return true; + }, + /** + * Resume a paused stream. + * @return {Boolean} true if this call resumed the worker, false otherwise. + */ + resume : function () { + if(!this.isPaused || this.isFinished) { + return false; + } + this.isPaused = false; + + // if true, the worker tried to resume but failed + var withError = false; + if(this.generatedError) { + this.error(this.generatedError); + withError = true; + } + if(this.previous) { + this.previous.resume(); + } + + return !withError; + }, + /** + * Flush any remaining bytes as the stream is ending. + */ + flush : function () {}, + /** + * Process a chunk. This is usually the method overridden. + * @param {Object} chunk the chunk to process. + */ + processChunk : function(chunk) { + this.push(chunk); + }, + /** + * Add a key/value to be added in the workers chain streamInfo once activated. 
+ * @param {String} key the key to use
+ * @param {Object} value the associated value
+ * @return {Worker} the current worker for chainability
+ */
+ withStreamInfo : function (key, value) {
+ this.extraStreamInfo[key] = value;
+ this.mergeStreamInfo();
+ return this;
+ },
+ /**
+ * Merge this worker's streamInfo into the chain's streamInfo.
+ */
+ mergeStreamInfo : function () {
+ for(var key in this.extraStreamInfo) {
+ if (!this.extraStreamInfo.hasOwnProperty(key)) {
+ continue;
+ }
+ this.streamInfo[key] = this.extraStreamInfo[key];
+ }
+ },
+
+ /**
+ * Lock the stream to prevent further updates on the workers chain.
+ * After calling this method, all calls to pipe will fail.
+ */
+ lock: function () {
+ if (this.isLocked) {
+ throw new Error("The stream '" + this + "' has already been used.");
+ }
+ this.isLocked = true;
+ if (this.previous) {
+ this.previous.lock();
+ }
+ },
+
+ /**
+ * Pretty print the workers chain.
+ */
+ toString : function () {
+ var me = "Worker " + this.name;
+ if (this.previous) {
+ return this.previous + " -> " + me;
+ } else {
+ return me;
+ }
+ }
+};
+
+module.exports = GenericWorker;
+
+},{}],29:[function(require,module,exports){
+'use strict';
+
+var utils = require('../utils');
+var ConvertWorker = require('./ConvertWorker');
+var GenericWorker = require('./GenericWorker');
+var base64 = require('../base64');
+var support = require("../support");
+var external = require("../external");
+
+var NodejsStreamOutputAdapter = null;
+if (support.nodestream) {
+ try {
+ NodejsStreamOutputAdapter = require('../nodejs/NodejsStreamOutputAdapter');
+ } catch(e) {}
+}
+
+/**
+ * Apply the final transformation of the data. If the user wants a Blob for
+ * example, it's easier to work with a Uint8Array and finally do the
+ * ArrayBuffer/Blob conversion.
+ * @param {String} type the name of the final type
+ * @param {String|Uint8Array|Buffer} content the content to transform
+ * @param {String} mimeType the mime type of the content, if applicable.
+ * @return {String|Uint8Array|ArrayBuffer|Buffer|Blob} the content in the right format.
+ */
+function transformZipOutput(type, content, mimeType) {
+ switch(type) {
+ case "blob" :
+ return utils.newBlob(utils.transformTo("arraybuffer", content), mimeType);
+ case "base64" :
+ return base64.encode(content);
+ default :
+ return utils.transformTo(type, content);
+ }
+}
+
+/**
+ * Concatenate an array of data of the given type.
+ * @param {String} type the type of the data in the given array.
+ * @param {Array} dataArray the array containing the data chunks to concatenate
+ * @return {String|Uint8Array|Buffer} the concatenated data
+ * @throws Error if the requested type is unsupported
+ */
+function concat (type, dataArray) {
+ var i, index = 0, res = null, totalLength = 0;
+ for(i = 0; i < dataArray.length; i++) {
+ totalLength += dataArray[i].length;
+ }
+ switch(type) {
+ case "string":
+ return dataArray.join("");
+ case "array":
+ return Array.prototype.concat.apply([], dataArray);
+ case "uint8array":
+ res = new Uint8Array(totalLength);
+ for(i = 0; i < dataArray.length; i++) {
+ res.set(dataArray[i], index);
+ index += dataArray[i].length;
+ }
+ return res;
+ case "nodebuffer":
+ return Buffer.concat(dataArray);
+ default:
+ throw new Error("concat : unsupported type '" + type + "'");
+ }
+}
+
+/**
+ * Listen to a StreamHelper, accumulate its content and concatenate it into a
+ * complete block.
+ * @param {StreamHelper} helper the helper to use.
+ * @param {Function} updateCallback a callback called on each update. Called
+ * with one arg :
+ * - the metadata linked to the update received.
+ * @return Promise the promise for the accumulation.
+ */
+function accumulate(helper, updateCallback) {
+ return new external.Promise(function (resolve, reject){
+ var dataArray = [];
+ var chunkType = helper._internalType,
+ resultType = helper._outputType,
+ mimeType = helper._mimeType;
+ helper
+ .on('data', function (data, meta) {
+ dataArray.push(data);
+ if(updateCallback) {
+ updateCallback(meta);
+ }
+ })
+ .on('error', function(err) {
+ dataArray = [];
+ reject(err);
+ })
+ .on('end', function (){
+ try {
+ var result = transformZipOutput(resultType, concat(chunkType, dataArray), mimeType);
+ resolve(result);
+ } catch (e) {
+ reject(e);
+ }
+ dataArray = [];
+ })
+ .resume();
+ });
+}
+
+/**
+ * A helper to easily use workers outside of JSZip.
+ * @constructor
+ * @param {Worker} worker the worker to wrap
+ * @param {String} outputType the type of data expected by the user
+ * @param {String} mimeType the mime type of the content, if applicable.
+ */
+function StreamHelper(worker, outputType, mimeType) {
+ var internalType = outputType;
+ switch(outputType) {
+ case "blob":
+ case "arraybuffer":
+ internalType = "uint8array";
+ break;
+ case "base64":
+ internalType = "string";
+ break;
+ }
+
+ try {
+ // the type used internally
+ this._internalType = internalType;
+ // the type used to output results
+ this._outputType = outputType;
+ // the mime type
+ this._mimeType = mimeType;
+ utils.checkSupport(internalType);
+ this._worker = worker.pipe(new ConvertWorker(internalType));
+ // the last workers can be rewired without issues but we need to
+ // prevent any updates on previous workers.
+ worker.lock();
+ } catch(e) {
+ this._worker = new GenericWorker("error");
+ this._worker.error(e);
+ }
+}
+
+StreamHelper.prototype = {
+ /**
+ * Listen to a StreamHelper, accumulate its content and concatenate it into a
+ * complete block.
+ * @param {Function} updateCb the update callback.
+ * @return Promise the promise for the accumulation.
+ */
+ accumulate : function (updateCb) {
+ return accumulate(this, updateCb);
+ },
+ /**
+ * Add a listener on an event triggered on a stream.
+ * @param {String} evt the name of the event
+ * @param {Function} fn the listener
+ * @return {StreamHelper} the current helper.
+ */
+ on : function (evt, fn) {
+ var self = this;
+
+ if(evt === "data") {
+ this._worker.on(evt, function (chunk) {
+ fn.call(self, chunk.data, chunk.meta);
+ });
+ } else {
+ this._worker.on(evt, function () {
+ utils.delay(fn, arguments, self);
+ });
+ }
+ return this;
+ },
+ /**
+ * Resume the flow of chunks.
+ * @return {StreamHelper} the current helper.
+ */
+ resume : function () {
+ utils.delay(this._worker.resume, [], this._worker);
+ return this;
+ },
+ /**
+ * Pause the flow of chunks.
+ * @return {StreamHelper} the current helper.
+ */
+ pause : function () {
+ this._worker.pause();
+ return this;
+ },
+ /**
+ * Return a nodejs stream for this helper.
+ * @param {Function} updateCb the update callback.
+ * @return {NodejsStreamOutputAdapter} the nodejs stream.
+ */
+ toNodejsStream : function (updateCb) {
+ utils.checkSupport("nodestream");
+ if (this._outputType !== "nodebuffer") {
+ // an object stream containing blob/arraybuffer/uint8array/string
+ // is strange and I don't know if it would be useful.
+ // If you find this comment and have a good use case, please open a
+ // bug report!
+ throw new Error(this._outputType + " is not supported by this method"); + } + + return new NodejsStreamOutputAdapter(this, { + objectMode : this._outputType !== "nodebuffer" + }, updateCb); + } +}; + + +module.exports = StreamHelper; + +},{"../base64":1,"../external":6,"../nodejs/NodejsStreamOutputAdapter":13,"../support":30,"../utils":32,"./ConvertWorker":24,"./GenericWorker":28}],30:[function(require,module,exports){ +'use strict'; + +exports.base64 = true; +exports.array = true; +exports.string = true; +exports.arraybuffer = typeof ArrayBuffer !== "undefined" && typeof Uint8Array !== "undefined"; +exports.nodebuffer = typeof Buffer !== "undefined"; +// contains true if JSZip can read/generate Uint8Array, false otherwise. +exports.uint8array = typeof Uint8Array !== "undefined"; + +if (typeof ArrayBuffer === "undefined") { + exports.blob = false; +} +else { + var buffer = new ArrayBuffer(0); + try { + exports.blob = new Blob([buffer], { + type: "application/zip" + }).size === 0; + } + catch (e) { + try { + var Builder = self.BlobBuilder || self.WebKitBlobBuilder || self.MozBlobBuilder || self.MSBlobBuilder; + var builder = new Builder(); + builder.append(buffer); + exports.blob = builder.getBlob('application/zip').size === 0; + } + catch (e) { + exports.blob = false; + } + } +} + +try { + exports.nodestream = !!require('readable-stream').Readable; +} catch(e) { + exports.nodestream = false; +} + +},{"readable-stream":16}],31:[function(require,module,exports){ +'use strict'; + +var utils = require('./utils'); +var support = require('./support'); +var nodejsUtils = require('./nodejsUtils'); +var GenericWorker = require('./stream/GenericWorker'); + +/** + * The following functions come from pako, from pako/lib/utils/strings + * released under the MIT license, see pako https://github.com/nodeca/pako/ + */ + +// Table with utf8 lengths (calculated by first byte of sequence) +// Note, that 5 & 6-byte values and some 4-byte values can not be represented in JS, +// 
because max possible codepoint is 0x10ffff +var _utf8len = new Array(256); +for (var i=0; i<256; i++) { + _utf8len[i] = (i >= 252 ? 6 : i >= 248 ? 5 : i >= 240 ? 4 : i >= 224 ? 3 : i >= 192 ? 2 : 1); +} +_utf8len[254]=_utf8len[254]=1; // Invalid sequence start + +// convert string to array (typed, when possible) +var string2buf = function (str) { + var buf, c, c2, m_pos, i, str_len = str.length, buf_len = 0; + + // count binary size + for (m_pos = 0; m_pos < str_len; m_pos++) { + c = str.charCodeAt(m_pos); + if ((c & 0xfc00) === 0xd800 && (m_pos+1 < str_len)) { + c2 = str.charCodeAt(m_pos+1); + if ((c2 & 0xfc00) === 0xdc00) { + c = 0x10000 + ((c - 0xd800) << 10) + (c2 - 0xdc00); + m_pos++; + } + } + buf_len += c < 0x80 ? 1 : c < 0x800 ? 2 : c < 0x10000 ? 3 : 4; + } + + // allocate buffer + if (support.uint8array) { + buf = new Uint8Array(buf_len); + } else { + buf = new Array(buf_len); + } + + // convert + for (i=0, m_pos = 0; i < buf_len; m_pos++) { + c = str.charCodeAt(m_pos); + if ((c & 0xfc00) === 0xd800 && (m_pos+1 < str_len)) { + c2 = str.charCodeAt(m_pos+1); + if ((c2 & 0xfc00) === 0xdc00) { + c = 0x10000 + ((c - 0xd800) << 10) + (c2 - 0xdc00); + m_pos++; + } + } + if (c < 0x80) { + /* one byte */ + buf[i++] = c; + } else if (c < 0x800) { + /* two bytes */ + buf[i++] = 0xC0 | (c >>> 6); + buf[i++] = 0x80 | (c & 0x3f); + } else if (c < 0x10000) { + /* three bytes */ + buf[i++] = 0xE0 | (c >>> 12); + buf[i++] = 0x80 | (c >>> 6 & 0x3f); + buf[i++] = 0x80 | (c & 0x3f); + } else { + /* four bytes */ + buf[i++] = 0xf0 | (c >>> 18); + buf[i++] = 0x80 | (c >>> 12 & 0x3f); + buf[i++] = 0x80 | (c >>> 6 & 0x3f); + buf[i++] = 0x80 | (c & 0x3f); + } + } + + return buf; +}; + +// Calculate max possible position in utf8 buffer, +// that will not break sequence. If that's not possible +// - (very small limits) return max size as is. 
+//
+// buf[] - utf8 bytes array
+// max - length limit (mandatory);
+var utf8border = function(buf, max) {
+ var pos;
+
+ max = max || buf.length;
+ if (max > buf.length) { max = buf.length; }
+
+ // go back from last position, until start of sequence found
+ pos = max-1;
+ while (pos >= 0 && (buf[pos] & 0xC0) === 0x80) { pos--; }
+
+ // Edge case: very small and broken sequence,
+ // return max, because we should return something anyway.
+ if (pos < 0) { return max; }
+
+ // If we came to start of buffer - that means the buffer is too small,
+ // return max too.
+ if (pos === 0) { return max; }
+
+ return (pos + _utf8len[buf[pos]] > max) ? pos : max;
+};
+
+// convert array to string
+var buf2string = function (buf) {
+ var str, i, out, c, c_len;
+ var len = buf.length;
+
+ // Reserve max possible length (2 words per char)
+ // NB: for unknown reasons, Array is significantly faster for
+ // String.fromCharCode.apply than Uint16Array.
+ var utf16buf = new Array(len*2);
+
+ for (out=0, i=0; i < len;) {
+ c = buf[i++];
+ // quick process ascii
+ if (c < 0x80) { utf16buf[out++] = c; continue; }
+
+ c_len = _utf8len[c];
+ // skip 5 & 6 byte codes
+ if (c_len > 4) { utf16buf[out++] = 0xfffd; i += c_len-1; continue; }
+
+ // apply mask on first byte
+ c &= c_len === 2 ? 0x1f : c_len === 3 ? 0x0f : 0x07;
+ // join the rest
+ while (c_len > 1 && i < len) {
+ c = (c << 6) | (buf[i++] & 0x3f);
+ c_len--;
+ }
+
+ // terminated by end of string?
+ if (c_len > 1) { utf16buf[out++] = 0xfffd; continue; }
+
+ if (c < 0x10000) {
+ utf16buf[out++] = c;
+ } else {
+ c -= 0x10000;
+ utf16buf[out++] = 0xd800 | ((c >> 10) & 0x3ff);
+ utf16buf[out++] = 0xdc00 | (c & 0x3ff);
+ }
+ }
+
+ // shrinkBuf(utf16buf, out)
+ if (utf16buf.length !== out) {
+ if(utf16buf.subarray) {
+ utf16buf = utf16buf.subarray(0, out);
+ } else {
+ utf16buf.length = out;
+ }
+ }
+
+ // return String.fromCharCode.apply(null, utf16buf);
+ return utils.applyFromCharCode(utf16buf);
+};
+
+
+// That's all for the pako functions.
+
+
+/**
+ * Transform a javascript string into an array (typed if possible) of bytes,
+ * UTF-8 encoded.
+ * @param {String} str the string to encode
+ * @return {Array|Uint8Array|Buffer} the UTF-8 encoded string.
+ */
+exports.utf8encode = function utf8encode(str) {
+ if (support.nodebuffer) {
+ return nodejsUtils.newBufferFrom(str, "utf-8");
+ }
+
+ return string2buf(str);
+};
+
+
+/**
+ * Transform a bytes array (or a representation) representing an UTF-8 encoded
+ * string into a javascript string.
+ * @param {Array|Uint8Array|Buffer} buf the data to decode
+ * @return {String} the decoded string.
+ */
+exports.utf8decode = function utf8decode(buf) {
+ if (support.nodebuffer) {
+ return utils.transformTo("nodebuffer", buf).toString("utf-8");
+ }
+
+ buf = utils.transformTo(support.uint8array ? "uint8array" : "array", buf);
+
+ return buf2string(buf);
+};
+
+/**
+ * A worker to decode utf8 encoded binary chunks into string chunks.
+ * @constructor
+ */
+function Utf8DecodeWorker() {
+ GenericWorker.call(this, "utf-8 decode");
+ // the last bytes if a chunk didn't end with a complete codepoint.
+ this.leftOver = null;
+}
+utils.inherits(Utf8DecodeWorker, GenericWorker);
+
+/**
+ * @see GenericWorker.processChunk
+ */
+Utf8DecodeWorker.prototype.processChunk = function (chunk) {
+
+ var data = utils.transformTo(support.uint8array ?
"uint8array" : "array", chunk.data); + + // 1st step, re-use what's left of the previous chunk + if (this.leftOver && this.leftOver.length) { + if(support.uint8array) { + var previousData = data; + data = new Uint8Array(previousData.length + this.leftOver.length); + data.set(this.leftOver, 0); + data.set(previousData, this.leftOver.length); + } else { + data = this.leftOver.concat(data); + } + this.leftOver = null; + } + + var nextBoundary = utf8border(data); + var usableData = data; + if (nextBoundary !== data.length) { + if (support.uint8array) { + usableData = data.subarray(0, nextBoundary); + this.leftOver = data.subarray(nextBoundary, data.length); + } else { + usableData = data.slice(0, nextBoundary); + this.leftOver = data.slice(nextBoundary, data.length); + } + } + + this.push({ + data : exports.utf8decode(usableData), + meta : chunk.meta + }); +}; + +/** + * @see GenericWorker.flush + */ +Utf8DecodeWorker.prototype.flush = function () { + if(this.leftOver && this.leftOver.length) { + this.push({ + data : exports.utf8decode(this.leftOver), + meta : {} + }); + this.leftOver = null; + } +}; +exports.Utf8DecodeWorker = Utf8DecodeWorker; + +/** + * A worker to endcode string chunks into utf8 encoded binary chunks. 
+ * @constructor
+ */
+function Utf8EncodeWorker() {
+ GenericWorker.call(this, "utf-8 encode");
+}
+utils.inherits(Utf8EncodeWorker, GenericWorker);
+
+/**
+ * @see GenericWorker.processChunk
+ */
+Utf8EncodeWorker.prototype.processChunk = function (chunk) {
+ this.push({
+ data : exports.utf8encode(chunk.data),
+ meta : chunk.meta
+ });
+};
+exports.Utf8EncodeWorker = Utf8EncodeWorker;
+
+},{"./nodejsUtils":14,"./stream/GenericWorker":28,"./support":30,"./utils":32}],32:[function(require,module,exports){
+'use strict';
+
+var support = require('./support');
+var base64 = require('./base64');
+var nodejsUtils = require('./nodejsUtils');
+var setImmediate = require('set-immediate-shim');
+var external = require("./external");
+
+
+/**
+ * Convert a string that passes as a "binary string": it should represent a byte
+ * array but may have char codes > 255. Keep only the first byte of each char
+ * code and return the byte array.
+ * @param {String} str the string to transform.
+ * @return {Array|Uint8Array} the string in a binary format.
+ */
+function string2binary(str) {
+ var result = null;
+ if (support.uint8array) {
+ result = new Uint8Array(str.length);
+ } else {
+ result = new Array(str.length);
+ }
+ return stringToArrayLike(str, result);
+}
+
+/**
+ * Create a new blob with the given content and the given type.
+ * @param {String|ArrayBuffer} part the content to put in the blob. DO NOT use
+ * a Uint8Array because the stock browser of android 4 won't accept it (it
+ * will be silently converted to a string, "[object Uint8Array]").
+ *
+ * Use only ONE part to build the blob to avoid a memory leak in IE11 / Edge:
+ * when a large amount of Array is used to create the Blob, the amount of
+ * memory consumed is nearly 100 times the original data amount.
+ *
+ * @param {String} type the mime type of the blob.
+ * @return {Blob} the created blob.
+ */
+exports.newBlob = function(part, type) {
+ exports.checkSupport("blob");
+
+ try {
+ // Blob constructor
+ return new Blob([part], {
+ type: type
+ });
+ }
+ catch (e) {
+
+ try {
+ // deprecated, browser only, old way
+ var Builder = self.BlobBuilder || self.WebKitBlobBuilder || self.MozBlobBuilder || self.MSBlobBuilder;
+ var builder = new Builder();
+ builder.append(part);
+ return builder.getBlob(type);
+ }
+ catch (e) {
+
+ // nothing worked, give up
+ throw new Error("Bug : can't construct the Blob.");
+ }
+ }
+
+
+};
+/**
+ * The identity function.
+ * @param {Object} input the input.
+ * @return {Object} the same input.
+ */
+function identity(input) {
+ return input;
+}
+
+/**
+ * Fill in an array with a string.
+ * @param {String} str the string to use.
+ * @param {Array|ArrayBuffer|Uint8Array|Buffer} array the array to fill in (will be mutated).
+ * @return {Array|ArrayBuffer|Uint8Array|Buffer} the updated array.
+ */
+function stringToArrayLike(str, array) {
+ for (var i = 0; i < str.length; ++i) {
+ array[i] = str.charCodeAt(i) & 0xFF;
+ }
+ return array;
+}
+
+/**
+ * A helper for the function arrayLikeToString.
+ * This contains static information and functions that
+ * can be optimized by the browser JIT compiler.
+ */
+var arrayToStringHelper = {
+ /**
+ * Transform an array of int into a string, chunk by chunk.
+ * See the performance notes on arrayLikeToString.
+ * @param {Array|ArrayBuffer|Uint8Array|Buffer} array the array to transform.
+ * @param {String} type the type of the array.
+ * @param {Integer} chunk the chunk size.
+ * @return {String} the resulting string.
+ * @throws Error if the chunk is too big for the stack.
+     */
+    stringifyByChunk: function(array, type, chunk) {
+        var result = [], k = 0, len = array.length;
+        // shortcut
+        if (len <= chunk) {
+            return String.fromCharCode.apply(null, array);
+        }
+        while (k < len) {
+            if (type === "array" || type === "nodebuffer") {
+                result.push(String.fromCharCode.apply(null, array.slice(k, Math.min(k + chunk, len))));
+            }
+            else {
+                result.push(String.fromCharCode.apply(null, array.subarray(k, Math.min(k + chunk, len))));
+            }
+            k += chunk;
+        }
+        return result.join("");
+    },
+    /**
+     * Call String.fromCharCode on every item in the array.
+     * This is the naive implementation, which generates a LOT of intermediate strings.
+     * This should be used when everything else fails.
+     * @param {Array|ArrayBuffer|Uint8Array|Buffer} array the array to transform.
+     * @return {String} the result.
+     */
+    stringifyByChar: function(array){
+        var resultStr = "";
+        for(var i = 0; i < array.length; i++) {
+            resultStr += String.fromCharCode(array[i]);
+        }
+        return resultStr;
+    },
+    applyCanBeUsed : {
+        /**
+         * true if the browser accepts String.fromCharCode.apply on a Uint8Array
+         */
+        uint8array : (function () {
+            try {
+                return support.uint8array && String.fromCharCode.apply(null, new Uint8Array(1)).length === 1;
+            } catch (e) {
+                return false;
+            }
+        })(),
+        /**
+         * true if the browser accepts String.fromCharCode.apply on a nodejs Buffer.
+         */
+        nodebuffer : (function () {
+            try {
+                return support.nodebuffer && String.fromCharCode.apply(null, nodejsUtils.allocBuffer(1)).length === 1;
+            } catch (e) {
+                return false;
+            }
+        })()
+    }
+};
+
+/**
+ * Transform an array-like object to a string.
+ * @param {Array|ArrayBuffer|Uint8Array|Buffer} array the array to transform.
+ * @return {String} the result.
+ */
+function arrayLikeToString(array) {
+    // Performance notes:
+    // --------------------
+    // String.fromCharCode.apply(null, array) is the fastest, see
+    // http://jsperf.com/converting-a-uint8array-to-a-string/2
+    // but the stack is limited (and we can get huge arrays!).
+    //
+    // result += String.fromCharCode(array[i]); generates too many strings!
+    //
+    // This code is inspired by http://jsperf.com/arraybuffer-to-string-apply-performance/2
+    // TODO : we now have workers that split the work. Do we still need that ?
+    var chunk = 65536,
+        type = exports.getTypeOf(array),
+        canUseApply = true;
+    if (type === "uint8array") {
+        canUseApply = arrayToStringHelper.applyCanBeUsed.uint8array;
+    } else if (type === "nodebuffer") {
+        canUseApply = arrayToStringHelper.applyCanBeUsed.nodebuffer;
+    }
+
+    if (canUseApply) {
+        while (chunk > 1) {
+            try {
+                return arrayToStringHelper.stringifyByChunk(array, type, chunk);
+            } catch (e) {
+                chunk = Math.floor(chunk / 2);
+            }
+        }
+    }
+
+    // no apply or chunk error : slow and painful algorithm
+    // default browser on android 4.*
+    return arrayToStringHelper.stringifyByChar(array);
+}
+
+exports.applyFromCharCode = arrayLikeToString;
+
+
+/**
+ * Copy the data from an array-like to another array-like.
+ * @param {Array|ArrayBuffer|Uint8Array|Buffer} arrayFrom the origin array.
+ * @param {Array|ArrayBuffer|Uint8Array|Buffer} arrayTo the destination array which will be mutated.
+ * @return {Array|ArrayBuffer|Uint8Array|Buffer} the updated destination array.
+ */
+function arrayLikeToArrayLike(arrayFrom, arrayTo) {
+    for (var i = 0; i < arrayFrom.length; i++) {
+        arrayTo[i] = arrayFrom[i];
+    }
+    return arrayTo;
+}
+
+// a matrix containing functions to transform everything into everything.
+var transform = {};
+
+// string to ?
+transform["string"] = { + "string": identity, + "array": function(input) { + return stringToArrayLike(input, new Array(input.length)); + }, + "arraybuffer": function(input) { + return transform["string"]["uint8array"](input).buffer; + }, + "uint8array": function(input) { + return stringToArrayLike(input, new Uint8Array(input.length)); + }, + "nodebuffer": function(input) { + return stringToArrayLike(input, nodejsUtils.allocBuffer(input.length)); + } +}; + +// array to ? +transform["array"] = { + "string": arrayLikeToString, + "array": identity, + "arraybuffer": function(input) { + return (new Uint8Array(input)).buffer; + }, + "uint8array": function(input) { + return new Uint8Array(input); + }, + "nodebuffer": function(input) { + return nodejsUtils.newBufferFrom(input); + } +}; + +// arraybuffer to ? +transform["arraybuffer"] = { + "string": function(input) { + return arrayLikeToString(new Uint8Array(input)); + }, + "array": function(input) { + return arrayLikeToArrayLike(new Uint8Array(input), new Array(input.byteLength)); + }, + "arraybuffer": identity, + "uint8array": function(input) { + return new Uint8Array(input); + }, + "nodebuffer": function(input) { + return nodejsUtils.newBufferFrom(new Uint8Array(input)); + } +}; + +// uint8array to ? +transform["uint8array"] = { + "string": arrayLikeToString, + "array": function(input) { + return arrayLikeToArrayLike(input, new Array(input.length)); + }, + "arraybuffer": function(input) { + return input.buffer; + }, + "uint8array": identity, + "nodebuffer": function(input) { + return nodejsUtils.newBufferFrom(input); + } +}; + +// nodebuffer to ? 
+transform["nodebuffer"] = { + "string": arrayLikeToString, + "array": function(input) { + return arrayLikeToArrayLike(input, new Array(input.length)); + }, + "arraybuffer": function(input) { + return transform["nodebuffer"]["uint8array"](input).buffer; + }, + "uint8array": function(input) { + return arrayLikeToArrayLike(input, new Uint8Array(input.length)); + }, + "nodebuffer": identity +}; + +/** + * Transform an input into any type. + * The supported output type are : string, array, uint8array, arraybuffer, nodebuffer. + * If no output type is specified, the unmodified input will be returned. + * @param {String} outputType the output type. + * @param {String|Array|ArrayBuffer|Uint8Array|Buffer} input the input to convert. + * @throws {Error} an Error if the browser doesn't support the requested output type. + */ +exports.transformTo = function(outputType, input) { + if (!input) { + // undefined, null, etc + // an empty string won't harm. + input = ""; + } + if (!outputType) { + return input; + } + exports.checkSupport(outputType); + var inputType = exports.getTypeOf(input); + var result = transform[inputType][outputType](input); + return result; +}; + +/** + * Return the type of the input. + * The type will be in a format valid for JSZip.utils.transformTo : string, array, uint8array, arraybuffer. + * @param {Object} input the input to identify. + * @return {String} the (lowercase) type of the input. + */ +exports.getTypeOf = function(input) { + if (typeof input === "string") { + return "string"; + } + if (Object.prototype.toString.call(input) === "[object Array]") { + return "array"; + } + if (support.nodebuffer && nodejsUtils.isBuffer(input)) { + return "nodebuffer"; + } + if (support.uint8array && input instanceof Uint8Array) { + return "uint8array"; + } + if (support.arraybuffer && input instanceof ArrayBuffer) { + return "arraybuffer"; + } +}; + +/** + * Throw an exception if the type is not supported. + * @param {String} type the type to check. 
+ * @throws {Error} an Error if the browser doesn't support the requested type.
+ */
+exports.checkSupport = function(type) {
+    var supported = support[type.toLowerCase()];
+    if (!supported) {
+        throw new Error(type + " is not supported by this platform");
+    }
+};
+
+exports.MAX_VALUE_16BITS = 65535;
+exports.MAX_VALUE_32BITS = -1; // well, "\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF" is parsed as -1
+
+/**
+ * Prettify a string read as binary.
+ * @param {string} str the string to prettify.
+ * @return {string} a pretty string.
+ */
+exports.pretty = function(str) {
+    var res = '',
+        code, i;
+    for (i = 0; i < (str || "").length; i++) {
+        code = str.charCodeAt(i);
+        res += '\\x' + (code < 16 ? "0" : "") + code.toString(16).toUpperCase();
+    }
+    return res;
+};
+
+/**
+ * Defer the call of a function.
+ * @param {Function} callback the function to call asynchronously.
+ * @param {Array} args the arguments to give to the callback.
+ */
+exports.delay = function(callback, args, self) {
+    setImmediate(function () {
+        callback.apply(self || null, args || []);
+    });
+};
+
+/**
+ * Extend a prototype with another, without calling a constructor with
+ * side effects. Inspired by nodejs' `utils.inherits`.
+ * @param {Function} ctor the constructor to augment
+ * @param {Function} superCtor the parent constructor to use
+ */
+exports.inherits = function (ctor, superCtor) {
+    var Obj = function() {};
+    Obj.prototype = superCtor.prototype;
+    ctor.prototype = new Obj();
+};
+
+/**
+ * Merge the objects passed as parameters into a new one.
+ * @private
+ * @param {...Object} var_args All objects to merge.
+ * @return {Object} a new object with the data of the others.
+ */
+exports.extend = function() {
+    var result = {}, i, attr;
+    for (i = 0; i < arguments.length; i++) { // arguments is not enumerable in some browsers
+        for (attr in arguments[i]) {
+            if (arguments[i].hasOwnProperty(attr) && typeof result[attr] === "undefined") {
+                result[attr] = arguments[i][attr];
+            }
+        }
+    }
+    return result;
+};
+
+/**
+ * Transform arbitrary content into a Promise.
+ * @param {String} name a name for the content being processed.
+ * @param {Object} inputData the content to process.
+ * @param {Boolean} isBinary true if the content is not a unicode string
+ * @param {Boolean} isOptimizedBinaryString true if the string content only has one byte per character.
+ * @param {Boolean} isBase64 true if the string content is encoded with base64.
+ * @return {Promise} a promise in a format usable by JSZip.
+ */
+exports.prepareContent = function(name, inputData, isBinary, isOptimizedBinaryString, isBase64) {
+
+    // if inputData is already a promise, this flattens it.
+    var promise = external.Promise.resolve(inputData).then(function(data) {
+
+
+        var isBlob = support.blob && (data instanceof Blob || ['[object File]', '[object Blob]'].indexOf(Object.prototype.toString.call(data)) !== -1);
+
+        if (isBlob && typeof FileReader !== "undefined") {
+            return new external.Promise(function (resolve, reject) {
+                var reader = new FileReader();
+
+                reader.onload = function(e) {
+                    resolve(e.target.result);
+                };
+                reader.onerror = function(e) {
+                    reject(e.target.error);
+                };
+                reader.readAsArrayBuffer(data);
+            });
+        } else {
+            return data;
+        }
+    });
+
+    return promise.then(function(data) {
+        var dataType = exports.getTypeOf(data);
+
+        if (!dataType) {
+            return external.Promise.reject(
+                new Error("Can't read the data of '" + name + "'. Is it " +
+                          "in a supported JavaScript type (String, Blob, ArrayBuffer, etc) ?")
+            );
+        }
+        // special case : it's way easier to work with Uint8Array than with ArrayBuffer
+        if (dataType === "arraybuffer") {
+            data = exports.transformTo("uint8array", data);
+        } else if (dataType === "string") {
+            if (isBase64) {
+                data = base64.decode(data);
+            }
+            else if (isBinary) {
+                // optimizedBinaryString === true means that the file has already been filtered with a 0xFF mask
+                if (isOptimizedBinaryString !== true) {
+                    // this is a string, not in a base64 format.
+                    // Be sure that this is a correct "binary string"
+                    data = string2binary(data);
+                }
+            }
+        }
+        return data;
+    });
+};
+
+},{"./base64":1,"./external":6,"./nodejsUtils":14,"./support":30,"set-immediate-shim":54}],33:[function(require,module,exports){
+'use strict';
+var readerFor = require('./reader/readerFor');
+var utils = require('./utils');
+var sig = require('./signature');
+var ZipEntry = require('./zipEntry');
+var utf8 = require('./utf8');
+var support = require('./support');
+// class ZipEntries {{{
+/**
+ * All the entries in the zip file.
+ * @constructor
+ * @param {Object} loadOptions Options for loading the stream.
+ */
+function ZipEntries(loadOptions) {
+    this.files = [];
+    this.loadOptions = loadOptions;
+}
+ZipEntries.prototype = {
+    /**
+     * Check that the reader is on the specified signature.
+     * @param {string} expectedSignature the expected signature.
+     * @throws {Error} if it is another signature.
+     */
+    checkSignature: function(expectedSignature) {
+        if (!this.reader.readAndCheckSignature(expectedSignature)) {
+            this.reader.index -= 4;
+            var signature = this.reader.readString(4);
+            throw new Error("Corrupted zip or bug: unexpected signature " + "(" + utils.pretty(signature) + ", expected " + utils.pretty(expectedSignature) + ")");
+        }
+    },
+    /**
+     * Check if the given signature is at the given index.
+     * @param {number} askedIndex the index to check.
+     * @param {string} expectedSignature the signature to expect.
+     * @return {boolean} true if the signature is here, false otherwise.
+     */
+    isSignature: function(askedIndex, expectedSignature) {
+        var currentIndex = this.reader.index;
+        this.reader.setIndex(askedIndex);
+        var signature = this.reader.readString(4);
+        var result = signature === expectedSignature;
+        this.reader.setIndex(currentIndex);
+        return result;
+    },
+    /**
+     * Read the end of the central directory.
+     */
+    readBlockEndOfCentral: function() {
+        this.diskNumber = this.reader.readInt(2);
+        this.diskWithCentralDirStart = this.reader.readInt(2);
+        this.centralDirRecordsOnThisDisk = this.reader.readInt(2);
+        this.centralDirRecords = this.reader.readInt(2);
+        this.centralDirSize = this.reader.readInt(4);
+        this.centralDirOffset = this.reader.readInt(4);
+
+        this.zipCommentLength = this.reader.readInt(2);
+        // warning : the encoding depends on the system locale.
+        // On a linux machine with LANG=en_US.utf8, this field is utf8 encoded.
+        // On a windows machine, this field is encoded with the localized windows code page.
+        var zipComment = this.reader.readData(this.zipCommentLength);
+        var decodeParamType = support.uint8array ? "uint8array" : "array";
+        // To get consistent behavior with the generation part, we will assume that
+        // this is utf8 encoded unless specified otherwise.
+        var decodeContent = utils.transformTo(decodeParamType, zipComment);
+        this.zipComment = this.loadOptions.decodeFileName(decodeContent);
+    },
+    /**
+     * Read the end of the Zip 64 central directory.
+     * Not merged with the method readEndOfCentral:
+     * the end of central directory record can coexist with its Zip64 counterpart,
+     * and we don't want to read the wrong number of bytes!
+ */ + readBlockZip64EndOfCentral: function() { + this.zip64EndOfCentralSize = this.reader.readInt(8); + this.reader.skip(4); + // this.versionMadeBy = this.reader.readString(2); + // this.versionNeeded = this.reader.readInt(2); + this.diskNumber = this.reader.readInt(4); + this.diskWithCentralDirStart = this.reader.readInt(4); + this.centralDirRecordsOnThisDisk = this.reader.readInt(8); + this.centralDirRecords = this.reader.readInt(8); + this.centralDirSize = this.reader.readInt(8); + this.centralDirOffset = this.reader.readInt(8); + + this.zip64ExtensibleData = {}; + var extraDataSize = this.zip64EndOfCentralSize - 44, + index = 0, + extraFieldId, + extraFieldLength, + extraFieldValue; + while (index < extraDataSize) { + extraFieldId = this.reader.readInt(2); + extraFieldLength = this.reader.readInt(4); + extraFieldValue = this.reader.readData(extraFieldLength); + this.zip64ExtensibleData[extraFieldId] = { + id: extraFieldId, + length: extraFieldLength, + value: extraFieldValue + }; + } + }, + /** + * Read the end of the Zip 64 central directory locator. + */ + readBlockZip64EndOfCentralLocator: function() { + this.diskWithZip64CentralDirStart = this.reader.readInt(4); + this.relativeOffsetEndOfZip64CentralDir = this.reader.readInt(8); + this.disksCount = this.reader.readInt(4); + if (this.disksCount > 1) { + throw new Error("Multi-volumes zip are not supported"); + } + }, + /** + * Read the local files, based on the offset read in the central part. + */ + readLocalFiles: function() { + var i, file; + for (i = 0; i < this.files.length; i++) { + file = this.files[i]; + this.reader.setIndex(file.localHeaderOffset); + this.checkSignature(sig.LOCAL_FILE_HEADER); + file.readLocalPart(this.reader); + file.handleUTF8(); + file.processAttributes(); + } + }, + /** + * Read the central directory. 
+ */ + readCentralDir: function() { + var file; + + this.reader.setIndex(this.centralDirOffset); + while (this.reader.readAndCheckSignature(sig.CENTRAL_FILE_HEADER)) { + file = new ZipEntry({ + zip64: this.zip64 + }, this.loadOptions); + file.readCentralPart(this.reader); + this.files.push(file); + } + + if (this.centralDirRecords !== this.files.length) { + if (this.centralDirRecords !== 0 && this.files.length === 0) { + // We expected some records but couldn't find ANY. + // This is really suspicious, as if something went wrong. + throw new Error("Corrupted zip or bug: expected " + this.centralDirRecords + " records in central dir, got " + this.files.length); + } else { + // We found some records but not all. + // Something is wrong but we got something for the user: no error here. + // console.warn("expected", this.centralDirRecords, "records in central dir, got", this.files.length); + } + } + }, + /** + * Read the end of central directory. + */ + readEndOfCentral: function() { + var offset = this.reader.lastIndexOfSignature(sig.CENTRAL_DIRECTORY_END); + if (offset < 0) { + // Check if the content is a truncated zip or complete garbage. + // A "LOCAL_FILE_HEADER" is not required at the beginning (auto + // extractible zip for example) but it can give a good hint. + // If an ajax request was used without responseType, we will also + // get unreadable data. + var isGarbage = !this.isSignature(0, sig.LOCAL_FILE_HEADER); + + if (isGarbage) { + throw new Error("Can't find end of central directory : is this a zip file ? 
" + + "If it is, see https://stuk.github.io/jszip/documentation/howto/read_zip.html"); + } else { + throw new Error("Corrupted zip: can't find end of central directory"); + } + + } + this.reader.setIndex(offset); + var endOfCentralDirOffset = offset; + this.checkSignature(sig.CENTRAL_DIRECTORY_END); + this.readBlockEndOfCentral(); + + + /* extract from the zip spec : + 4) If one of the fields in the end of central directory + record is too small to hold required data, the field + should be set to -1 (0xFFFF or 0xFFFFFFFF) and the + ZIP64 format record should be created. + 5) The end of central directory record and the + Zip64 end of central directory locator record must + reside on the same disk when splitting or spanning + an archive. + */ + if (this.diskNumber === utils.MAX_VALUE_16BITS || this.diskWithCentralDirStart === utils.MAX_VALUE_16BITS || this.centralDirRecordsOnThisDisk === utils.MAX_VALUE_16BITS || this.centralDirRecords === utils.MAX_VALUE_16BITS || this.centralDirSize === utils.MAX_VALUE_32BITS || this.centralDirOffset === utils.MAX_VALUE_32BITS) { + this.zip64 = true; + + /* + Warning : the zip64 extension is supported, but ONLY if the 64bits integer read from + the zip file can fit into a 32bits integer. This cannot be solved : JavaScript represents + all numbers as 64-bit double precision IEEE 754 floating point numbers. + So, we have 53bits for integers and bitwise operations treat everything as 32bits. 
+ see https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Operators/Bitwise_Operators + and http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-262.pdf section 8.5 + */ + + // should look for a zip64 EOCD locator + offset = this.reader.lastIndexOfSignature(sig.ZIP64_CENTRAL_DIRECTORY_LOCATOR); + if (offset < 0) { + throw new Error("Corrupted zip: can't find the ZIP64 end of central directory locator"); + } + this.reader.setIndex(offset); + this.checkSignature(sig.ZIP64_CENTRAL_DIRECTORY_LOCATOR); + this.readBlockZip64EndOfCentralLocator(); + + // now the zip64 EOCD record + if (!this.isSignature(this.relativeOffsetEndOfZip64CentralDir, sig.ZIP64_CENTRAL_DIRECTORY_END)) { + // console.warn("ZIP64 end of central directory not where expected."); + this.relativeOffsetEndOfZip64CentralDir = this.reader.lastIndexOfSignature(sig.ZIP64_CENTRAL_DIRECTORY_END); + if (this.relativeOffsetEndOfZip64CentralDir < 0) { + throw new Error("Corrupted zip: can't find the ZIP64 end of central directory"); + } + } + this.reader.setIndex(this.relativeOffsetEndOfZip64CentralDir); + this.checkSignature(sig.ZIP64_CENTRAL_DIRECTORY_END); + this.readBlockZip64EndOfCentral(); + } + + var expectedEndOfCentralDirOffset = this.centralDirOffset + this.centralDirSize; + if (this.zip64) { + expectedEndOfCentralDirOffset += 20; // end of central dir 64 locator + expectedEndOfCentralDirOffset += 12 /* should not include the leading 12 bytes */ + this.zip64EndOfCentralSize; + } + + var extraBytes = endOfCentralDirOffset - expectedEndOfCentralDirOffset; + + if (extraBytes > 0) { + // console.warn(extraBytes, "extra bytes at beginning or within zipfile"); + if (this.isSignature(endOfCentralDirOffset, sig.CENTRAL_FILE_HEADER)) { + // The offsets seem wrong, but we have something at the specified offset. + // So… we keep it. 
+ } else { + // the offset is wrong, update the "zero" of the reader + // this happens if data has been prepended (crx files for example) + this.reader.zero = extraBytes; + } + } else if (extraBytes < 0) { + throw new Error("Corrupted zip: missing " + Math.abs(extraBytes) + " bytes."); + } + }, + prepareReader: function(data) { + this.reader = readerFor(data); + }, + /** + * Read a zip file and create ZipEntries. + * @param {String|ArrayBuffer|Uint8Array|Buffer} data the binary string representing a zip file. + */ + load: function(data) { + this.prepareReader(data); + this.readEndOfCentral(); + this.readCentralDir(); + this.readLocalFiles(); + } +}; +// }}} end of ZipEntries +module.exports = ZipEntries; + +},{"./reader/readerFor":22,"./signature":23,"./support":30,"./utf8":31,"./utils":32,"./zipEntry":34}],34:[function(require,module,exports){ +'use strict'; +var readerFor = require('./reader/readerFor'); +var utils = require('./utils'); +var CompressedObject = require('./compressedObject'); +var crc32fn = require('./crc32'); +var utf8 = require('./utf8'); +var compressions = require('./compressions'); +var support = require('./support'); + +var MADE_BY_DOS = 0x00; +var MADE_BY_UNIX = 0x03; + +/** + * Find a compression registered in JSZip. + * @param {string} compressionMethod the method magic to find. + * @return {Object|null} the JSZip compression object, null if none found. + */ +var findCompression = function(compressionMethod) { + for (var method in compressions) { + if (!compressions.hasOwnProperty(method)) { + continue; + } + if (compressions[method].magic === compressionMethod) { + return compressions[method]; + } + } + return null; +}; + +// class ZipEntry {{{ +/** + * An entry in the zip file. + * @constructor + * @param {Object} options Options of the current file. + * @param {Object} loadOptions Options for loading the stream. 
+ */ +function ZipEntry(options, loadOptions) { + this.options = options; + this.loadOptions = loadOptions; +} +ZipEntry.prototype = { + /** + * say if the file is encrypted. + * @return {boolean} true if the file is encrypted, false otherwise. + */ + isEncrypted: function() { + // bit 1 is set + return (this.bitFlag & 0x0001) === 0x0001; + }, + /** + * say if the file has utf-8 filename/comment. + * @return {boolean} true if the filename/comment is in utf-8, false otherwise. + */ + useUTF8: function() { + // bit 11 is set + return (this.bitFlag & 0x0800) === 0x0800; + }, + /** + * Read the local part of a zip file and add the info in this object. + * @param {DataReader} reader the reader to use. + */ + readLocalPart: function(reader) { + var compression, localExtraFieldsLength; + + // we already know everything from the central dir ! + // If the central dir data are false, we are doomed. + // On the bright side, the local part is scary : zip64, data descriptors, both, etc. + // The less data we get here, the more reliable this should be. + // Let's skip the whole header and dash to the data ! + reader.skip(22); + // in some zip created on windows, the filename stored in the central dir contains \ instead of /. + // Strangely, the filename here is OK. + // I would love to treat these zip files as corrupted (see http://www.info-zip.org/FAQ.html#backslashes + // or APPNOTE#4.4.17.1, "All slashes MUST be forward slashes '/'") but there are a lot of bad zip generators... + // Search "unzip mismatching "local" filename continuing with "central" filename version" on + // the internet. + // + // I think I see the logic here : the central directory is used to display + // content and the local directory is used to extract the files. Mixing / and \ + // may be used to display \ to windows users and use / when extracting the files. 
+        // Unfortunately, this also leads to some issues : http://seclists.org/fulldisclosure/2009/Sep/394
+        this.fileNameLength = reader.readInt(2);
+        localExtraFieldsLength = reader.readInt(2); // can't be sure this will be the same as the central dir
+        // the fileName is stored as binary data, the handleUTF8 method will take care of the encoding.
+        this.fileName = reader.readData(this.fileNameLength);
+        reader.skip(localExtraFieldsLength);
+
+        if (this.compressedSize === -1 || this.uncompressedSize === -1) {
+            throw new Error("Bug or corrupted zip : didn't get enough informations from the central directory " + "(compressedSize === -1 || uncompressedSize === -1)");
+        }
+
+        compression = findCompression(this.compressionMethod);
+        if (compression === null) { // no compression found
+            throw new Error("Corrupted zip : compression " + utils.pretty(this.compressionMethod) + " unknown (inner file : " + utils.transformTo("string", this.fileName) + ")");
+        }
+        this.decompressed = new CompressedObject(this.compressedSize, this.uncompressedSize, this.crc32, compression, reader.readData(this.compressedSize));
+    },
+
+    /**
+     * Read the central part of a zip file and add the info in this object.
+     * @param {DataReader} reader the reader to use.
+ */ + readCentralPart: function(reader) { + this.versionMadeBy = reader.readInt(2); + reader.skip(2); + // this.versionNeeded = reader.readInt(2); + this.bitFlag = reader.readInt(2); + this.compressionMethod = reader.readString(2); + this.date = reader.readDate(); + this.crc32 = reader.readInt(4); + this.compressedSize = reader.readInt(4); + this.uncompressedSize = reader.readInt(4); + var fileNameLength = reader.readInt(2); + this.extraFieldsLength = reader.readInt(2); + this.fileCommentLength = reader.readInt(2); + this.diskNumberStart = reader.readInt(2); + this.internalFileAttributes = reader.readInt(2); + this.externalFileAttributes = reader.readInt(4); + this.localHeaderOffset = reader.readInt(4); + + if (this.isEncrypted()) { + throw new Error("Encrypted zip are not supported"); + } + + // will be read in the local part, see the comments there + reader.skip(fileNameLength); + this.readExtraFields(reader); + this.parseZIP64ExtraField(reader); + this.fileComment = reader.readData(this.fileCommentLength); + }, + + /** + * Parse the external file attributes and get the unix/dos permissions. + */ + processAttributes: function () { + this.unixPermissions = null; + this.dosPermissions = null; + var madeBy = this.versionMadeBy >> 8; + + // Check if we have the DOS directory flag set. + // We look for it in the DOS and UNIX permissions + // but some unknown platform could set it as a compatibility flag. + this.dir = this.externalFileAttributes & 0x0010 ? 
true : false; + + if(madeBy === MADE_BY_DOS) { + // first 6 bits (0 to 5) + this.dosPermissions = this.externalFileAttributes & 0x3F; + } + + if(madeBy === MADE_BY_UNIX) { + this.unixPermissions = (this.externalFileAttributes >> 16) & 0xFFFF; + // the octal permissions are in (this.unixPermissions & 0x01FF).toString(8); + } + + // fail safe : if the name ends with a / it probably means a folder + if (!this.dir && this.fileNameStr.slice(-1) === '/') { + this.dir = true; + } + }, + + /** + * Parse the ZIP64 extra field and merge the info in the current ZipEntry. + * @param {DataReader} reader the reader to use. + */ + parseZIP64ExtraField: function(reader) { + + if (!this.extraFields[0x0001]) { + return; + } + + // should be something, preparing the extra reader + var extraReader = readerFor(this.extraFields[0x0001].value); + + // I really hope that these 64bits integer can fit in 32 bits integer, because js + // won't let us have more. + if (this.uncompressedSize === utils.MAX_VALUE_32BITS) { + this.uncompressedSize = extraReader.readInt(8); + } + if (this.compressedSize === utils.MAX_VALUE_32BITS) { + this.compressedSize = extraReader.readInt(8); + } + if (this.localHeaderOffset === utils.MAX_VALUE_32BITS) { + this.localHeaderOffset = extraReader.readInt(8); + } + if (this.diskNumberStart === utils.MAX_VALUE_32BITS) { + this.diskNumberStart = extraReader.readInt(4); + } + }, + /** + * Read the central part of a zip file and add the info in this object. + * @param {DataReader} reader the reader to use. 
+ */ + readExtraFields: function(reader) { + var end = reader.index + this.extraFieldsLength, + extraFieldId, + extraFieldLength, + extraFieldValue; + + if (!this.extraFields) { + this.extraFields = {}; + } + + while (reader.index < end) { + extraFieldId = reader.readInt(2); + extraFieldLength = reader.readInt(2); + extraFieldValue = reader.readData(extraFieldLength); + + this.extraFields[extraFieldId] = { + id: extraFieldId, + length: extraFieldLength, + value: extraFieldValue + }; + } + }, + /** + * Apply an UTF8 transformation if needed. + */ + handleUTF8: function() { + var decodeParamType = support.uint8array ? "uint8array" : "array"; + if (this.useUTF8()) { + this.fileNameStr = utf8.utf8decode(this.fileName); + this.fileCommentStr = utf8.utf8decode(this.fileComment); + } else { + var upath = this.findExtraFieldUnicodePath(); + if (upath !== null) { + this.fileNameStr = upath; + } else { + // ASCII text or unsupported code page + var fileNameByteArray = utils.transformTo(decodeParamType, this.fileName); + this.fileNameStr = this.loadOptions.decodeFileName(fileNameByteArray); + } + + var ucomment = this.findExtraFieldUnicodeComment(); + if (ucomment !== null) { + this.fileCommentStr = ucomment; + } else { + // ASCII text or unsupported code page + var commentByteArray = utils.transformTo(decodeParamType, this.fileComment); + this.fileCommentStr = this.loadOptions.decodeFileName(commentByteArray); + } + } + }, + + /** + * Find the unicode path declared in the extra field, if any. + * @return {String} the unicode path, null otherwise. + */ + findExtraFieldUnicodePath: function() { + var upathField = this.extraFields[0x7075]; + if (upathField) { + var extraReader = readerFor(upathField.value); + + // wrong version + if (extraReader.readInt(1) !== 1) { + return null; + } + + // the crc of the filename changed, this field is out of date. 
+ if (crc32fn(this.fileName) !== extraReader.readInt(4)) { + return null; + } + + return utf8.utf8decode(extraReader.readData(upathField.length - 5)); + } + return null; + }, + + /** + * Find the unicode comment declared in the extra field, if any. + * @return {String} the unicode comment, null otherwise. + */ + findExtraFieldUnicodeComment: function() { + var ucommentField = this.extraFields[0x6375]; + if (ucommentField) { + var extraReader = readerFor(ucommentField.value); + + // wrong version + if (extraReader.readInt(1) !== 1) { + return null; + } + + // the crc of the comment changed, this field is out of date. + if (crc32fn(this.fileComment) !== extraReader.readInt(4)) { + return null; + } + + return utf8.utf8decode(extraReader.readData(ucommentField.length - 5)); + } + return null; + } +}; +module.exports = ZipEntry; + +},{"./compressedObject":2,"./compressions":3,"./crc32":4,"./reader/readerFor":22,"./support":30,"./utf8":31,"./utils":32}],35:[function(require,module,exports){ +'use strict'; + +var StreamHelper = require('./stream/StreamHelper'); +var DataWorker = require('./stream/DataWorker'); +var utf8 = require('./utf8'); +var CompressedObject = require('./compressedObject'); +var GenericWorker = require('./stream/GenericWorker'); + +/** + * A simple object representing a file in the zip file. 
+ * @constructor + * @param {string} name the name of the file + * @param {String|ArrayBuffer|Uint8Array|Buffer} data the data + * @param {Object} options the options of the file + */ +var ZipObject = function(name, data, options) { + this.name = name; + this.dir = options.dir; + this.date = options.date; + this.comment = options.comment; + this.unixPermissions = options.unixPermissions; + this.dosPermissions = options.dosPermissions; + + this._data = data; + this._dataBinary = options.binary; + // keep only the compression + this.options = { + compression : options.compression, + compressionOptions : options.compressionOptions + }; +}; + +ZipObject.prototype = { + /** + * Create an internal stream for the content of this object. + * @param {String} type the type of each chunk. + * @return StreamHelper the stream. + */ + internalStream: function (type) { + var result = null, outputType = "string"; + try { + if (!type) { + throw new Error("No output type specified."); + } + outputType = type.toLowerCase(); + var askUnicodeString = outputType === "string" || outputType === "text"; + if (outputType === "binarystring" || outputType === "text") { + outputType = "string"; + } + result = this._decompressWorker(); + + var isUnicodeString = !this._dataBinary; + + if (isUnicodeString && !askUnicodeString) { + result = result.pipe(new utf8.Utf8EncodeWorker()); + } + if (!isUnicodeString && askUnicodeString) { + result = result.pipe(new utf8.Utf8DecodeWorker()); + } + } catch (e) { + result = new GenericWorker("error"); + result.error(e); + } + + return new StreamHelper(result, outputType, ""); + }, + + /** + * Prepare the content in the asked type. + * @param {String} type the type of the result. + * @param {Function} onUpdate a function to call on each internal update. + * @return Promise the promise of the result. + */ + async: function (type, onUpdate) { + return this.internalStream(type).accumulate(onUpdate); + }, + + /** + * Prepare the content as a nodejs stream. 
+ * @param {String} type the type of each chunk. + * @param {Function} onUpdate a function to call on each internal update. + * @return Stream the stream. + */ + nodeStream: function (type, onUpdate) { + return this.internalStream(type || "nodebuffer").toNodejsStream(onUpdate); + }, + + /** + * Return a worker for the compressed content. + * @private + * @param {Object} compression the compression object to use. + * @param {Object} compressionOptions the options to use when compressing. + * @return Worker the worker. + */ + _compressWorker: function (compression, compressionOptions) { + if ( + this._data instanceof CompressedObject && + this._data.compression.magic === compression.magic + ) { + return this._data.getCompressedWorker(); + } else { + var result = this._decompressWorker(); + if(!this._dataBinary) { + result = result.pipe(new utf8.Utf8EncodeWorker()); + } + return CompressedObject.createWorkerFrom(result, compression, compressionOptions); + } + }, + /** + * Return a worker for the decompressed content. + * @private + * @return Worker the worker. 
+ */ + _decompressWorker : function () { + if (this._data instanceof CompressedObject) { + return this._data.getContentWorker(); + } else if (this._data instanceof GenericWorker) { + return this._data; + } else { + return new DataWorker(this._data); + } + } +}; + +var removedMethods = ["asText", "asBinary", "asNodeBuffer", "asUint8Array", "asArrayBuffer"]; +var removedFn = function () { + throw new Error("This method has been removed in JSZip 3.0, please check the upgrade guide."); +}; + +for(var i = 0; i < removedMethods.length; i++) { + ZipObject.prototype[removedMethods[i]] = removedFn; +} +module.exports = ZipObject; + +},{"./compressedObject":2,"./stream/DataWorker":27,"./stream/GenericWorker":28,"./stream/StreamHelper":29,"./utf8":31}],36:[function(require,module,exports){ +(function (global){ +'use strict'; +var Mutation = global.MutationObserver || global.WebKitMutationObserver; + +var scheduleDrain; + +{ + if (Mutation) { + var called = 0; + var observer = new Mutation(nextTick); + var element = global.document.createTextNode(''); + observer.observe(element, { + characterData: true + }); + scheduleDrain = function () { + element.data = (called = ++called % 2); + }; + } else if (!global.setImmediate && typeof global.MessageChannel !== 'undefined') { + var channel = new global.MessageChannel(); + channel.port1.onmessage = nextTick; + scheduleDrain = function () { + channel.port2.postMessage(0); + }; + } else if ('document' in global && 'onreadystatechange' in global.document.createElement('script')) { + scheduleDrain = function () { + + // Create a + + + + + + +
    + +
    +
    + +
    + +

    Interface AccessTokenRetriever

    +
    +
    +
    +
    All Superinterfaces:
    +
    AutoCloseable, Closeable, Initable
    +
    +
    +
    All Known Implementing Classes:
    +
    FileTokenRetriever, HttpAccessTokenRetriever
    +
    +
    +
    public interface AccessTokenRetriever +extends Initable, Closeable
    +
    An AccessTokenRetriever is the internal API by which the login module will + retrieve an access token for use in authorization by the broker. The implementation may + involve authentication to a remote system, or it can be as simple as loading the contents + of a file or configuration setting. + + Retrieval is a separate concern from validation, so it isn't necessary for + the AccessTokenRetriever implementation to validate the integrity of the JWT + access token.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      default void
      + +
      +
      Lifecycle method to perform a clean shutdown of the retriever.
      +
      + + +
      +
      Retrieves a JWT access token in its serialized three-part form.
      +
      +
      +
      +
      +
      +

      Methods inherited from interface org.apache.kafka.common.security.oauthbearer.secured.Initable

      +init
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        retrieve

        +
        String retrieve() + throws IOException
        +
        Retrieves a JWT access token in its serialized three-part form. The implementation + is free to determine how it should be retrieved but should not perform validation + on the result. + + Note: This is a blocking function and callers should be aware that the + implementation may be communicating over a network, with the file system, coordinating + threads, etc. The facility in the LoginModule from + which this is ultimately called does not provide an asynchronous approach.
        +
        +
        Returns:
        +
        Non-null JWT access token string
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during retrieval
        +
        +
        +
      • +
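The retrieval contract described above (a blocking `retrieve()` that returns the serialized JWT without validating it, plus a lifecycle `close()`) can be sketched with a hypothetical file-backed retriever in the spirit of `FileTokenRetriever`. The `SimpleTokenRetriever` interface and `FileBackedTokenRetriever` class below are illustrative stand-ins, not the actual Kafka types:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical stand-in for the AccessTokenRetriever contract: a blocking
// retrieve() that returns the serialized JWT, and a default no-op close().
interface SimpleTokenRetriever extends AutoCloseable {
    String retrieve() throws IOException;
    @Override
    default void close() throws IOException { }
}

// Minimal file-backed retriever: it only loads the token text and, per the
// javadoc above, performs no validation of the JWT itself.
class FileBackedTokenRetriever implements SimpleTokenRetriever {
    private final Path tokenFile;

    FileBackedTokenRetriever(Path tokenFile) {
        this.tokenFile = tokenFile;
    }

    @Override
    public String retrieve() throws IOException {
        // Blocking file-system read, matching the note that callers must
        // expect blocking behavior from implementations.
        return new String(Files.readAllBytes(tokenFile), StandardCharsets.UTF_8).trim();
    }
}
```

A real implementation such as `HttpAccessTokenRetriever` would block on a network call here instead of a file read; the shape of the contract is the same.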
      • +
        +

        close

        +
        default void close() + throws IOException
        +
Lifecycle method to perform a clean shutdown of the retriever. This must + be performed by the caller to ensure the correct state, freeing up and releasing any + resources allocated in Initable.init().
        +
        +
        Specified by:
        +
        close in interface AutoCloseable
        +
        Specified by:
        +
        close in interface Closeable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during closure
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenRetrieverFactory.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenRetrieverFactory.html new file mode 100644 index 000000000..2dce70cea --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenRetrieverFactory.html @@ -0,0 +1,186 @@ + + + + +AccessTokenRetrieverFactory (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class AccessTokenRetrieverFactory

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.AccessTokenRetrieverFactory
    +
    +
    +
    +
    public class AccessTokenRetrieverFactory +extends Object
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        AccessTokenRetrieverFactory

        +
        public AccessTokenRetrieverFactory()
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      + +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidator.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidator.html new file mode 100644 index 000000000..e4bc4c070 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidator.html @@ -0,0 +1,172 @@ + + + + +AccessTokenValidator (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Interface AccessTokenValidator

    +
    +
    +
    +
    All Known Implementing Classes:
    +
    LoginAccessTokenValidator, ValidatorAccessTokenValidator
    +
    +
    +
    public interface AccessTokenValidator
    +
An instance of AccessTokenValidator acts as a function object that, given an access + token in base-64 encoded JWT format, can parse the data, perform validation, and construct an + OAuthBearerToken for use by the caller. + + The primary reason for this abstraction is that client and broker may have different libraries + available to them to perform these operations. Additionally, the exact steps for validation may + differ between implementations. To put this more concretely: the implementation in the Kafka + client does not bundle a robust library to perform this logic, and it is not the + client's responsibility to perform rigorous validation. However, the Kafka broker ships with + a richer set of library dependencies that can perform more substantial validation and is also + expected to perform a trust-but-verify test of the access token's signature. + + See: + 
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      + +
      validate(String accessToken)
      +
      +
      Accepts an OAuth JWT access token in base-64 encoded format, validates, and returns an + OAuthBearerToken.
      +
      +
      +
      +
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        validate

        +
        OAuthBearerToken validate(String accessToken) + throws ValidateException
        +
        Accepts an OAuth JWT access token in base-64 encoded format, validates, and returns an + OAuthBearerToken.
        +
        +
        Parameters:
        +
        accessToken - Non-null JWT access token
        +
        Returns:
        +
        OAuthBearerToken
        +
        Throws:
        +
        ValidateException - Thrown on errors performing validation of given token
        +
        +
        +
      • +
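The parse-then-validate shape of `validate()` can be illustrated with a deliberately simplified structural check. This hypothetical helper only verifies the serialized three-part JWS form and decodes the payload, roughly the level of checking described for the client side; a broker-side validator would additionally verify the signature, expiration, and issuer. `IllegalArgumentException` stands in for `ValidateException`:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical, much-simplified analogue of AccessTokenValidator.validate():
// it checks only the header.payload.signature structure and decodes the
// payload; it does not verify the signature or any claims.
final class StructuralTokenCheck {
    static String payloadJson(String accessToken) {
        String[] parts = accessToken.split("\\.");
        if (parts.length != 3 || parts[0].isEmpty() || parts[1].isEmpty()) {
            // Stands in for throwing ValidateException.
            throw new IllegalArgumentException("Malformed JWT: expected header.payload.signature");
        }
        byte[] decoded = Base64.getUrlDecoder().decode(parts[1]);
        return new String(decoded, StandardCharsets.UTF_8);
    }
}
```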
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidatorFactory.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidatorFactory.html new file mode 100644 index 000000000..9c1fe0b97 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/AccessTokenValidatorFactory.html @@ -0,0 +1,193 @@ + + + + +AccessTokenValidatorFactory (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class AccessTokenValidatorFactory

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.AccessTokenValidatorFactory
    +
    +
    +
    +
    public class AccessTokenValidatorFactory +extends Object
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        AccessTokenValidatorFactory

        +
        public AccessTokenValidatorFactory()
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      + +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/BasicOAuthBearerToken.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/BasicOAuthBearerToken.html new file mode 100644 index 000000000..bbdd635c6 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/BasicOAuthBearerToken.html @@ -0,0 +1,323 @@ + + + + +BasicOAuthBearerToken (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class BasicOAuthBearerToken

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    OAuthBearerToken
    +
    +
    +
    public class BasicOAuthBearerToken +extends Object +implements OAuthBearerToken
    +
    An implementation of the OAuthBearerToken that fairly straightforwardly stores the values + given to its constructor (except the scope set which is copied to avoid modifications). + + Very little validation is applied here with respect to the validity of the given values. All + validation is assumed to happen by users of this class.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        BasicOAuthBearerToken

        +
        public BasicOAuthBearerToken(String token, + Set<String> scopes, + long lifetimeMs, + String principalName, + Long startTimeMs)
        +
        Creates a new OAuthBearerToken instance around the given values.
        +
        +
        Parameters:
        +
        token - Value containing the compact serialization as a base 64 string that + can be parsed, decoded, and validated as a well-formed JWS. Must be + non-null, non-blank, and non-whitespace only.
        +
        scopes - Set of non-null scopes. May contain case-sensitive + "duplicates". The given set is copied and made unmodifiable so neither + the caller of this constructor nor any downstream users can modify it.
        +
        lifetimeMs - The token's lifetime, expressed as the number of milliseconds since the + epoch. Must be non-negative.
        +
        principalName - The name of the principal to which this credential applies. Must be + non-null, non-blank, and non-whitespace only.
        +
        startTimeMs - The token's start time, expressed as the number of milliseconds since + the epoch, if available, otherwise null. Must be + non-negative if a non-null value is provided.
        +
        +
        +
      • +
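The constructor's note that the scope set "is copied and made unmodifiable so neither the caller of this constructor nor any downstream users can modify it" can be sketched as follows. `ScopedToken` is a hypothetical, stripped-down analogue, not the Kafka `BasicOAuthBearerToken` class itself:

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// Hypothetical analogue illustrating the defensive copy: later mutation of
// the caller's set is not reflected, and the exposed set cannot be modified.
final class ScopedToken {
    private final String value;
    private final Set<String> scope;

    ScopedToken(String value, Set<String> scopes) {
        this.value = value;
        // Copy first, then wrap: downstream callers get a read-only view.
        this.scope = Collections.unmodifiableSet(new HashSet<>(scopes));
    }

    String value() { return value; }
    Set<String> scope() { return scope; }
}
```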
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        value

        +
        public String value()
        +
        The b64token value as defined in + RFC 6750 Section + 2.1
        +
        +
        Specified by:
        +
        value in interface OAuthBearerToken
        +
        Returns:
        +
        b64token value as defined in + RFC 6750 + Section 2.1
        +
        +
        +
      • +
      • +
        +

        scope

        +
        public Set<String> scope()
        +
        The token's scope of access, as per + RFC 6749 Section + 1.4
        +
        +
        Specified by:
        +
        scope in interface OAuthBearerToken
        +
        Returns:
        +
        the token's (always non-null but potentially empty) scope of access, + as per RFC + 6749 Section 1.4. Note that all values in the returned set will + be trimmed of preceding and trailing whitespace, and the result will + never contain the empty string.
        +
        +
        +
      • +
      • +
        +

        lifetimeMs

        +
        public long lifetimeMs()
        +
        The token's lifetime, expressed as the number of milliseconds since the + epoch, as per RFC + 6749 Section 1.4
        +
        +
        Specified by:
        +
        lifetimeMs in interface OAuthBearerToken
        +
        Returns:
        +
        the token's lifetime, expressed as the number of milliseconds since + the epoch, as per + RFC 6749 + Section 1.4.
        +
        +
        +
      • +
      • +
        +

        principalName

        +
        public String principalName()
        +
        The name of the principal to which this credential applies
        +
        +
        Specified by:
        +
        principalName in interface OAuthBearerToken
        +
        Returns:
        +
        the always non-null/non-empty principal name
        +
        +
        +
      • +
      • +
        +

        startTimeMs

        +
        public Long startTimeMs()
        +
        When the credential became valid, in terms of the number of milliseconds + since the epoch, if known, otherwise null. An expiring credential may not + necessarily indicate when it was created -- just when it expires -- so we + need to support a null return value here.
        +
        +
        Specified by:
        +
        startTimeMs in interface OAuthBearerToken
        +
        Returns:
        +
        the time when the credential became valid, in terms of the number of + milliseconds since the epoch, if known, otherwise null
        +
        +
        +
      • +
      • +
        +

        toString

        +
        public String toString()
        +
        +
        Overrides:
        +
        toString in class Object
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ClaimValidationUtils.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ClaimValidationUtils.html new file mode 100644 index 000000000..d6f2f17ed --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ClaimValidationUtils.html @@ -0,0 +1,342 @@ + + + + +ClaimValidationUtils (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class ClaimValidationUtils

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
    +
    +
    +
    +
    public class ClaimValidationUtils +extends Object
    +
    Simple utility class to perform basic cleaning and validation on input values so that they're + performed consistently throughout the code base.
    +
    +
    +
      + +
    • +
      +

      Constructor Summary

      +
      Constructors
      +
      +
      Constructor
      +
      Description
      + +
       
      +
      +
      +
    • + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      static String
      + +
      +
      Validates that the given claim name override is valid, where invalid means + any of the following: + + + null + Zero length + Whitespace only +
      +
      +
      static long
      +
      validateExpiration(String claimName, + Long claimValue)
      +
      +
      Validates that the given lifetime is valid, where invalid means any of + the following: + + + null + Negative +
      +
      +
      static Long
      +
      validateIssuedAt(String claimName, + Long claimValue)
      +
      +
Validates that the given issued at claim value is valid, where invalid means any of + the following: + + + Negative + 
      +
      +
      static Set<String>
      +
      validateScopes(String scopeClaimName, + Collection<String> scopes)
      +
      +
      Validates that the scopes are valid, where invalid means any of + the following: + + + Collection is null + Collection has duplicates + Any of the elements in the collection are null + Any of the elements in the collection are zero length + Any of the elements in the collection are whitespace only +
      +
      +
      static String
      +
      validateSubject(String claimName, + String claimValue)
      +
      +
      Validates that the given claim value is valid, where invalid means any of + the following: + + + null + Zero length + Whitespace only +
      +
      +
      +
      +
      +
      +

      Methods inherited from class java.lang.Object

      +clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        ClaimValidationUtils

        +
        public ClaimValidationUtils()
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        validateScopes

        +
        public static Set<String> validateScopes(String scopeClaimName, + Collection<String> scopes) + throws ValidateException
        +
        Validates that the scopes are valid, where invalid means any of + the following: + +
          +
        • Collection is null
        • +
        • Collection has duplicates
        • +
        • Any of the elements in the collection are null
        • +
        • Any of the elements in the collection are zero length
        • +
        • Any of the elements in the collection are whitespace only
        • +
        +
        +
        Parameters:
        +
        scopeClaimName - Name of the claim used for the scope values
        +
        scopes - Collection of String scopes
        +
        Returns:
        +
        Unmodifiable Set that includes the values of the original set, but with + each value trimmed
        +
        Throws:
        +
        ValidateException - Thrown if the value is null, contains duplicates, or + if any of the values in the set are null, empty, + or whitespace only
        +
        +
        +
      • +
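The `validateScopes()` rules listed above can be sketched directly: each scope is trimmed, and a null collection, null/empty/whitespace-only elements, and post-trim duplicates are all rejected. This is a hypothetical sketch, with `IllegalArgumentException` standing in for Kafka's `ValidateException`:

```java
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch of the validateScopes() contract: returns an
// unmodifiable set of trimmed scopes, or rejects invalid input.
final class ScopeCheck {
    static Set<String> validateScopes(String scopeClaimName, Collection<String> scopes) {
        if (scopes == null)
            throw new IllegalArgumentException(scopeClaimName + " must not be null");
        Set<String> result = new LinkedHashSet<>();
        for (String s : scopes) {
            if (s == null || s.trim().isEmpty())
                throw new IllegalArgumentException(scopeClaimName + " contains an empty scope");
            // add() returns false when the trimmed value is a duplicate.
            if (!result.add(s.trim()))
                throw new IllegalArgumentException(scopeClaimName + " contains duplicate scope: " + s.trim());
        }
        return Collections.unmodifiableSet(result);
    }
}
```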
      • +
        +

        validateExpiration

        +
        public static long validateExpiration(String claimName, + Long claimValue) + throws ValidateException
        +
        Validates that the given lifetime is valid, where invalid means any of + the following: + +
          +
        • null
        • +
        • Negative
        • +
        +
        +
        Parameters:
        +
        claimName - Name of the claim
        +
        claimValue - Expiration time (in milliseconds)
        +
        Returns:
        +
        Input parameter, as provided
        +
        Throws:
        +
        ValidateException - Thrown if the value is null or negative
        +
        +
        +
      • +
      • +
        +

        validateSubject

        +
        public static String validateSubject(String claimName, + String claimValue) + throws ValidateException
        +
        Validates that the given claim value is valid, where invalid means any of + the following: + +
          +
        • null
        • +
        • Zero length
        • +
        • Whitespace only
        • +
        +
        +
        Parameters:
        +
        claimName - Name of the claim
        +
        claimValue - Name of the subject
        +
        Returns:
        +
        Trimmed version of the claimValue parameter
        +
        Throws:
        +
        ValidateException - Thrown if the value is null, empty, or whitespace only
        +
        +
        +
      • +
      • +
        +

        validateIssuedAt

        +
        public static Long validateIssuedAt(String claimName, + Long claimValue) + throws ValidateException
        +
Validates that the given issued at claim value is valid, where invalid means any of + the following: + +
          +
        • Negative
        • +
        +
        +
        Parameters:
        +
        claimName - Name of the claim
        +
        claimValue - Start time (in milliseconds) or null if not used
        +
        Returns:
        +
        Input parameter, as provided
        +
        Throws:
        +
        ValidateException - Thrown if the value is negative
        +
        +
        +
      • +
      • +
        +

        validateClaimNameOverride

        +
        public static String validateClaimNameOverride(String name, + String value) + throws ValidateException
        +
        Validates that the given claim name override is valid, where invalid means + any of the following: + +
          +
        • null
        • +
        • Zero length
        • +
        • Whitespace only
        • +
        +
        +
        Parameters:
        +
        name - "Standard" name of the claim, e.g. sub
        +
        value - "Override" name of the claim, e.g. email
        +
        Returns:
        +
        Trimmed version of the value parameter
        +
        Throws:
        +
        ValidateException - Thrown if the value is null, empty, or whitespace only
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/CloseableVerificationKeyResolver.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/CloseableVerificationKeyResolver.html new file mode 100644 index 000000000..66fe1268e --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/CloseableVerificationKeyResolver.html @@ -0,0 +1,168 @@ + + + + +CloseableVerificationKeyResolver (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Interface CloseableVerificationKeyResolver

    +
    +
    +
    +
    All Superinterfaces:
    +
    AutoCloseable, Closeable, Initable, org.jose4j.keys.resolvers.VerificationKeyResolver
    +
    +
    +
    All Known Implementing Classes:
    +
    JwksFileVerificationKeyResolver, RefreshingHttpsJwksVerificationKeyResolver
    +
    +
    +
    public interface CloseableVerificationKeyResolver +extends Initable, Closeable, org.jose4j.keys.resolvers.VerificationKeyResolver
    +
    The OAuthBearerValidatorCallbackHandler uses a VerificationKeyResolver as + part of its validation of the incoming JWT. Some of the VerificationKeyResolver + implementations use resources like threads, connections, etc. that should be properly closed + when no longer needed. Since the VerificationKeyResolver interface itself doesn't + define a close method, we provide a means to do that here.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      default void
      + +
      +
      Lifecycle method to perform a clean shutdown of the VerificationKeyResolver.
      +
      +
      +
      +
      +
      +

      Methods inherited from interface org.apache.kafka.common.security.oauthbearer.secured.Initable

      +init
      +
      +

      Methods inherited from interface org.jose4j.keys.resolvers.VerificationKeyResolver

      +resolveKey
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        close

        +
        default void close() + throws IOException
        +
Lifecycle method to perform a clean shutdown of the VerificationKeyResolver. + This must be performed by the caller to ensure the correct state, freeing up + and releasing any resources allocated in Initable.init().
        +
        +
        Specified by:
        +
        close in interface AutoCloseable
        +
        Specified by:
        +
        close in interface Closeable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during closure
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ConfigurationUtils.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ConfigurationUtils.html new file mode 100644 index 000000000..853728e1c --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ConfigurationUtils.html @@ -0,0 +1,339 @@ + + + + +ConfigurationUtils (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class ConfigurationUtils

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
    +
    +
    +
    +
    public class ConfigurationUtils +extends Object
    +
    ConfigurationUtils is a utility class to perform basic configuration-related + logic and is separated out here for easier, more direct testing.
    +
    +
    +
      + +
    • +
      +

      Constructor Summary

      +
      Constructors
      +
      +
      Constructor
      +
      Description
      + +
       
      +
      ConfigurationUtils(Map<String,?> configs, + String saslMechanism)
      +
       
      +
      +
      +
    • + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      <T> T
      +
      get(String name)
      +
       
      + + +
      +
Validates that, if a value is supplied, it is a file that: + + + exists + has read permission + points to a file + + + If the value is null or an empty string, it is assumed to be an "empty" value and thus ignored.
      +
      + +
      validateInteger(String name, + boolean isRequired)
      +
      +
Validates that, if a value is supplied, it is a value that: + + + is an Integer + has a value that is not less than the provided minimum value + + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored.
      +
      + + +
      +
Validates that, if a value is supplied, it is a value that: + + + is an Integer + has a value that is not less than the provided minimum value + + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored.
      +
      + +
      validateLong(String name, + boolean isRequired)
      +
       
      + +
      validateLong(String name, + boolean isRequired, + Long min)
      +
       
      + + +
       
      + +
      validateString(String name, + boolean isRequired)
      +
       
      + + +
      +
Validates that the configured URL: + + + is well-formed + contains a scheme + uses either HTTP, HTTPS, or file protocols + + + No effort is made to connect to the URL in the validation step.
      +
      +
      +
      +
      +
      +

      Methods inherited from class java.lang.Object

      +clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        ConfigurationUtils

        +
        public ConfigurationUtils(Map<String,?> configs)
        +
        +
      • +
      • +
        +

        ConfigurationUtils

        +
        public ConfigurationUtils(Map<String,?> configs, + String saslMechanism)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        validateFile

        +
        public Path validateFile(String name)
        +
        Validates that, if a value is supplied, it is a file that: + +
      • +
          exists
        +
          has read permission
        +
          points to a file
        +
      • + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored. Any whitespace is trimmed off of the beginning and end.
        +
        +
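The three file checks above can be sketched with the JDK's java.nio.file API. This is a hypothetical illustration of the documented behavior (FileCheck and its method are invented names, not the ConfigurationUtils source):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical sketch of validateFile-style checks: trim the configured value,
// treat null/blank as "empty" (ignored), then verify the path exists, is
// readable, and points to a regular file.
public class FileCheck {

    public static Path validateFile(String value) {
        if (value == null || value.trim().isEmpty())
            return null;                      // "empty" values are ignored

        Path path = Paths.get(value.trim());  // whitespace is trimmed first

        if (!Files.exists(path))
            throw new IllegalArgumentException(path + " does not exist");
        if (!Files.isReadable(path))
            throw new IllegalArgumentException(path + " is not readable");
        if (!Files.isRegularFile(path))
            throw new IllegalArgumentException(path + " does not point to a file");

        return path;
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("token", ".txt");
        System.out.println(validateFile(tmp.toString()));  // prints the temp path
        System.out.println(validateFile("  "));            // prints null
        Files.delete(tmp);
    }
}
```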
      • +
      • +
        +

        validateInteger

        +
        public Integer validateInteger(String name, + boolean isRequired)
        +
        Validates that, if a value is supplied, it is a value that: + +
      • +
          is an Integer
        +
          has a value that is not less than the provided minimum value
        +
      • + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored. Any whitespace is trimmed off of the beginning and end.
        +
        +
      • +
      • +
        +

        validateLong

        +
        public Long validateLong(String name)
        +
        Validates that, if a value is supplied, it is a value that: + +
      • +
          is a Long
        +
          has a value that is not less than the provided minimum value
        +
      • + + If the value is null or an empty string, it is assumed to be an "empty" value and thus + ignored. Any whitespace is trimmed off of the beginning and end.
        +
        +
      • +
      • +
        +

        validateLong

        +
        public Long validateLong(String name, + boolean isRequired)
        +
        +
      • +
      • +
        +

        validateLong

        +
        public Long validateLong(String name, + boolean isRequired, + Long min)
        +
        +
      • +
      • +
        +

        validateUrl

        +
        public URL validateUrl(String name)
        +
        Validates that the configured URL: + +
      • +
          is well-formed
        +
          contains a scheme
        +
          uses either HTTP, HTTPS, or file protocols
        +
      • + + No effort is made to connect to the URL in the validation step.
        +
        +
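The three URL checks can be sketched with java.net.URI from the JDK. UrlCheck and isValid are hypothetical names used only for illustration; as documented above, no connection is attempted:

```java
import java.net.URI;

// Hypothetical sketch of validateUrl-style checks: the value must parse as a
// URI, must carry a scheme, and the scheme must be http, https, or file.
public class UrlCheck {

    public static boolean isValid(String value) {
        try {
            URI uri = new URI(value.trim());   // must be well-formed
            String scheme = uri.getScheme();   // must contain a scheme
            if (scheme == null)
                return false;
            scheme = scheme.toLowerCase();
            return scheme.equals("http") || scheme.equals("https") || scheme.equals("file");
        } catch (Exception e) {
            return false;                      // malformed URL
        }
    }

    public static void main(String[] args) {
        System.out.println(isValid("https://example.com/oauth2/v1/token")); // true
        System.out.println(isValid("not a url"));                           // false
    }
}
```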
      • +
      • +
        +

        validateString

        +
        public String validateString(String name) + throws ValidateException
        +
        +
        Throws:
        +
        ValidateException
        +
        +
        +
      • +
      • +
        +

        validateString

        +
        public String validateString(String name, + boolean isRequired) + throws ValidateException
        +
        +
        Throws:
        +
        ValidateException
        +
        +
        +
      • +
      • +
        +

        get

        +
        public <T> T get(String name)
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/FileTokenRetriever.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/FileTokenRetriever.html new file mode 100644 index 000000000..9fb0870b6 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/FileTokenRetriever.html @@ -0,0 +1,217 @@ + + + + +FileTokenRetriever (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class FileTokenRetriever

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.FileTokenRetriever
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Closeable, AutoCloseable, AccessTokenRetriever, Initable
    +
    +
    +
    public class FileTokenRetriever +extends Object +implements AccessTokenRetriever
    +
    FileTokenRetriever is an AccessTokenRetriever that will load the contents of the + given file, interpreting them as a JWT access token in its serialized form.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        FileTokenRetriever

        +
        public FileTokenRetriever(Path accessTokenFile)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        init

        +
        public void init() + throws IOException
        +
        Description copied from interface: Initable
        +
        Lifecycle method to perform any one-time initialization of the retriever. This must + be performed by the caller to ensure the correct state before methods are invoked.
        +
        +
        Specified by:
        +
        init in interface Initable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during initialization
        +
        +
        +
      • +
      • +
        +

        retrieve

        +
        public String retrieve() + throws IOException
        +
        Description copied from interface: AccessTokenRetriever
        +
        Retrieves a JWT access token in its serialized three-part form. The implementation + is free to determine how it should be retrieved but should not perform validation + on the result. + + Note: This is a blocking function and callers should be aware that the + implementation may be communicating over a network, with the file system, coordinating + threads, etc. The facility in the LoginModule from + which this is ultimately called does not provide an asynchronous approach.
        +
        +
        Specified by:
        +
        retrieve in interface AccessTokenRetriever
        +
        Returns:
        +
        Non-null JWT access token string
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during retrieval
        +
        +
        +
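A file-based retrieve() of the shape described above might look like the following stdlib-only sketch. FileTokenSketch is a hypothetical name and this is not the FileTokenRetriever source; it only illustrates the blocking read-and-trim contract:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch: read the whole file, trim surrounding whitespace, and
// hand back the serialized JWT. No validation is performed on the result.
public class FileTokenSketch {

    private final Path accessTokenFile;

    public FileTokenSketch(Path accessTokenFile) {
        this.accessTokenFile = accessTokenFile;
    }

    public String retrieve() throws IOException {
        // Blocking file-system read, per the note in the interface contract.
        String contents = new String(Files.readAllBytes(accessTokenFile)).trim();
        if (contents.isEmpty())
            throw new IOException("empty token file: " + accessTokenFile);
        return contents;                       // non-null serialized token
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("access-token", ".jwt");
        Files.write(tmp, "abc.def.ghi\n".getBytes());
        System.out.println(new FileTokenSketch(tmp).retrieve()); // prints abc.def.ghi
        Files.delete(tmp);
    }
}
```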
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/HttpAccessTokenRetriever.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/HttpAccessTokenRetriever.html new file mode 100644 index 000000000..cb98b0183 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/HttpAccessTokenRetriever.html @@ -0,0 +1,282 @@ + + + + +HttpAccessTokenRetriever (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class HttpAccessTokenRetriever

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.HttpAccessTokenRetriever
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Closeable, AutoCloseable, AccessTokenRetriever, Initable
    +
    +
    +
    public class HttpAccessTokenRetriever +extends Object +implements AccessTokenRetriever
    +
    HttpAccessTokenRetriever is an AccessTokenRetriever that will + communicate with an OAuth/OIDC provider directly via HTTP to post client credentials + (OAuthBearerLoginCallbackHandler.CLIENT_ID_CONFIG/OAuthBearerLoginCallbackHandler.CLIENT_SECRET_CONFIG) + to a publicized token endpoint URL + (SaslConfigs.SASL_OAUTHBEARER_TOKEN_ENDPOINT_URL).
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Field Details

      + +
      +
    • + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        HttpAccessTokenRetriever

        +
        public HttpAccessTokenRetriever(String clientId, + String clientSecret, + String scope, + SSLSocketFactory sslSocketFactory, + String tokenEndpointUrl, + long loginRetryBackoffMs, + long loginRetryBackoffMaxMs, + Integer loginConnectTimeoutMs, + Integer loginReadTimeoutMs)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        retrieve

        +
        public String retrieve() + throws IOException
        +
        Retrieves a JWT access token in its serialized three-part form. The implementation + is free to determine how it should be retrieved but should not perform validation + on the result. + + Note: This is a blocking function and callers should be aware that the + implementation communicates over a network. The facility in the + LoginModule from which this is ultimately called + does not provide an asynchronous approach.
        +
        +
        Specified by:
        +
        retrieve in interface AccessTokenRetriever
        +
        Returns:
        +
        Non-null JWT access token string
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during retrieval
        +
        +
        +
      • +
      • +
        +

        post

        +
        public static String post(HttpURLConnection con, + Map<String,String> headers, + String requestBody, + Integer connectTimeoutMs, + Integer readTimeoutMs) + throws IOException, +UnretryableException
        +
        +
        Throws:
        +
        IOException
        +
        UnretryableException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Initable.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Initable.html new file mode 100644 index 000000000..40b4c265f --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Initable.html @@ -0,0 +1,141 @@ + + + + +Initable (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + + +
    +
    +
    All Known Subinterfaces:
    +
    AccessTokenRetriever, CloseableVerificationKeyResolver
    +
    +
    +
    All Known Implementing Classes:
    +
    FileTokenRetriever, HttpAccessTokenRetriever, JwksFileVerificationKeyResolver, RefreshingHttpsJwks, RefreshingHttpsJwksVerificationKeyResolver
    +
    +
    +
    public interface Initable
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      default void
      + +
      +
      Lifecycle method to perform any one-time initialization of the retriever.
      +
      +
      +
      +
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        init

        +
        default void init() + throws IOException
        +
        Lifecycle method to perform any one-time initialization of the retriever. This must + be performed by the caller to ensure the correct state before methods are invoked.
        +
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during initialization
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JaasOptionsUtils.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JaasOptionsUtils.html new file mode 100644 index 000000000..a93cd2914 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JaasOptionsUtils.html @@ -0,0 +1,219 @@ + + + + +JaasOptionsUtils (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class JaasOptionsUtils

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.JaasOptionsUtils
    +
    +
    +
    +
    public class JaasOptionsUtils +extends Object
    +
    JaasOptionsUtils is a utility class to perform logic for the JAAS options and + is separated out here for easier, more direct testing.
    +
    +
    + +
    +
    + +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JwksFileVerificationKeyResolver.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JwksFileVerificationKeyResolver.html new file mode 100644 index 000000000..77f9e0b8c --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/JwksFileVerificationKeyResolver.html @@ -0,0 +1,245 @@ + + + + +JwksFileVerificationKeyResolver (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class JwksFileVerificationKeyResolver

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.JwksFileVerificationKeyResolver
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Closeable, AutoCloseable, CloseableVerificationKeyResolver, Initable, org.jose4j.keys.resolvers.VerificationKeyResolver
    +
    +
    +
    public class JwksFileVerificationKeyResolver +extends Object +implements CloseableVerificationKeyResolver
    +
    JwksFileVerificationKeyResolver is a VerificationKeyResolver implementation + that will load the JWKS from the given file system file. + + A JWKS (JSON Web Key Set) + is a JSON document provided by the OAuth/OIDC provider that lists the keys used to sign the JWTs + it issues. + + Here is a sample JWKS JSON document: + +
    + {
    +   "keys": [
    +     {
    +       "kty": "RSA",
    +       "alg": "RS256",
    +       "kid": "abc123",
    +       "use": "sig",
    +       "e": "AQAB",
    +       "n": "..."
    +     },
    +     {
    +       "kty": "RSA",
    +       "alg": "RS256",
    +       "kid": "def456",
    +       "use": "sig",
    +       "e": "AQAB",
    +       "n": "..."
    +     }
    +   ]
    + }
    + 
    + + Without going into too much detail, the array of keys enumerates the key data that the provider + is using to sign the JWT. The key ID (kid) is referenced by the JWT's header in + order to match up the JWT's signing key with the key in the JWKS. During the validation step of + the broker, the jose4j OAuth library will use the contents of the appropriate key in the JWKS + to validate the signature. + + Given that the JWKS is referenced by the JWT, the JWKS must be made available by the + OAuth/OIDC provider so that a JWT can be validated.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        JwksFileVerificationKeyResolver

        +
        public JwksFileVerificationKeyResolver(Path jwksFile)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        init

        +
        public void init() + throws IOException
        +
        Description copied from interface: Initable
        +
        Lifecycle method to perform any one-time initialization of the retriever. This must + be performed by the caller to ensure the correct state before methods are invoked.
        +
        +
        Specified by:
        +
        init in interface Initable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during initialization
        +
        +
        +
      • +
      • +
        +

        resolveKey

        +
        public Key resolveKey(org.jose4j.jws.JsonWebSignature jws, + List<org.jose4j.jwx.JsonWebStructure> nestingContext) + throws org.jose4j.lang.UnresolvableKeyException
        +
        +
        Specified by:
        +
        resolveKey in interface org.jose4j.keys.resolvers.VerificationKeyResolver
        +
        Throws:
        +
        org.jose4j.lang.UnresolvableKeyException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/LoginAccessTokenValidator.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/LoginAccessTokenValidator.html new file mode 100644 index 000000000..550994bf1 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/LoginAccessTokenValidator.html @@ -0,0 +1,260 @@ + + + + +LoginAccessTokenValidator (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class LoginAccessTokenValidator

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.LoginAccessTokenValidator
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    AccessTokenValidator
    +
    +
    +
    public class LoginAccessTokenValidator +extends Object +implements AccessTokenValidator
    +
    LoginAccessTokenValidator is an implementation of AccessTokenValidator that is used + by the client to perform some rudimentary validation of the JWT access token that is received + as part of the response from posting the client credentials to the OAuth/OIDC provider's + token endpoint. + + The validation steps performed are: + +
      +
    1. Basic structural validation of the b64token value as defined in + RFC 6750 Section 2.1
    +
    2. Basic conversion of the token into an in-memory map
    +
    3. Presence of scope, exp, subject, and iat claims
    +
    +
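The structural-validation step can be illustrated with a stdlib-only sketch. JwtShapeCheck is a hypothetical name, and this is deliberately far weaker than the real validator (which also converts the payload and checks claims): it only verifies the three-part, base64url-decodable shape of a serialized JWT.

```java
import java.util.Base64;

// Hypothetical sketch: a serialized JWT must have three dot-separated
// sections, and the header and payload must be base64url-decodable.
public class JwtShapeCheck {

    public static boolean looksLikeJwt(String token) {
        if (token == null)
            return false;
        String[] parts = token.split("\\.");
        if (parts.length != 3)                        // header.payload.signature
            return false;
        try {
            Base64.getUrlDecoder().decode(parts[0]);  // header decodes
            Base64.getUrlDecoder().decode(parts[1]);  // payload decodes
            return true;
        } catch (IllegalArgumentException e) {
            return false;                             // not valid base64url
        }
    }

    public static void main(String[] args) {
        String header  = Base64.getUrlEncoder().withoutPadding()
                               .encodeToString("{\"alg\":\"RS256\"}".getBytes());
        String payload = Base64.getUrlEncoder().withoutPadding()
                               .encodeToString("{\"sub\":\"jdoe\",\"exp\":0}".getBytes());
        System.out.println(looksLikeJwt(header + "." + payload + ".c2ln")); // true
        System.out.println(looksLikeJwt("not-a-jwt"));                      // false
    }
}
```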
    +
    + +
    +
    +
      + +
    • +
      +

      Field Details

      + +
      +
    • + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        LoginAccessTokenValidator

        +
        public LoginAccessTokenValidator(String scopeClaimName, + String subClaimName)
        +
        Creates a new LoginAccessTokenValidator that will be used by the client for lightweight + validation of the JWT.
        +
        +
        Parameters:
        +
        scopeClaimName - Name of the scope claim to use; must be non-null
        +
        subClaimName - Name of the subject claim to use; must be non-null
        +
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      + +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerLoginCallbackHandler.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerLoginCallbackHandler.html new file mode 100644 index 000000000..06a98d114 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerLoginCallbackHandler.html @@ -0,0 +1,455 @@ + + + + +OAuthBearerLoginCallbackHandler (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class OAuthBearerLoginCallbackHandler

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    CallbackHandler, AuthenticateCallbackHandler
    +
    +
    +
    public class OAuthBearerLoginCallbackHandler +extends Object +implements AuthenticateCallbackHandler
    +

    + OAuthBearerLoginCallbackHandler is an AuthenticateCallbackHandler that + accepts OAuthBearerTokenCallback and SaslExtensionsCallback callbacks to + perform the steps to request a JWT from an OAuth/OIDC provider using the + client_credentials grant type. This grant type is commonly used for non-interactive + "service accounts" where there is no user available to interactively supply credentials. +

    + +

    + The OAuthBearerLoginCallbackHandler is used on the client side to retrieve a JWT + and the OAuthBearerValidatorCallbackHandler is used on the broker to validate the JWT + that was sent to it by the client to allow access. Both the brokers and clients will need to + be configured with their appropriate callback handlers and respective configuration for OAuth + functionality to work. +

    + +

    + Note that while this callback handler class must be specified for a Kafka client that wants to + use OAuth functionality, in the case of OAuth-based inter-broker communication, the callback + handler must be used on the Kafka broker side as well. + +

    + +

    + This AuthenticateCallbackHandler is enabled by specifying its class name in the Kafka + configuration. For client use, specify the class name in the + SaslConfigs.SASL_LOGIN_CALLBACK_HANDLER_CLASS + configuration like so: + + + sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler + +

    + +

    + If using OAuth login on the broker side (for inter-broker communication), the callback handler + class will be specified with a listener-based property: + listener.name.<listenerName>.oauthbearer.sasl.login.callback.handler.class like so: + + + listener.name.<listenerName>.oauthbearer.sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler + +

    + +

    + The Kafka configuration must also include JAAS configuration which includes the following + OAuth-specific options: + +

      +
    • clientIdOAuth client ID (required)
    • +
    • clientSecretOAuth client secret (required)
    • +
    • scopeOAuth scope (optional)
    • +
    +

    + +

    + The JAAS configuration can also include any SSL options that are needed. The configuration + options are the same as those specified by the configuration in + SslConfigs.addClientSslSupport(ConfigDef). +

    + +

    + Here's an example of the JAAS configuration for a Kafka client: + + + sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \ + clientId="foo" \ + clientSecret="bar" \ + scope="baz" \ + ssl.protocol="SSL" ; + +

    + +

    + The configuration option + SaslConfigs.SASL_OAUTHBEARER_TOKEN_ENDPOINT_URL + is also required in order for the client to contact the OAuth/OIDC provider. For example: + + + sasl.oauthbearer.token.endpoint.url=https://example.com/oauth2/v1/token + + + Please see the OAuth/OIDC providers documentation for the token endpoint URL. +

    + +

    + The following is a list of all the configuration options that are available for the login + callback handler: + +

    +

    +
    +
    + +
    +
    +
      + +
    • +
      +

      Field Details

      + +
      +
    • + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        OAuthBearerLoginCallbackHandler

        +
        public OAuthBearerLoginCallbackHandler()
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        configure

        +
        public void configure(Map<String,?> configs, + String saslMechanism, + List<AppConfigurationEntry> jaasConfigEntries)
        +
        Description copied from interface: AuthenticateCallbackHandler
        +
        Configures this callback handler for the specified SASL mechanism.
        +
        +
        Specified by:
        +
        configure in interface AuthenticateCallbackHandler
        +
        Parameters:
        +
        configs - Key-value pairs containing the parsed configuration options of + the client or broker. Note that these are the Kafka configuration options + and not the JAAS configuration options. JAAS config options may be obtained + from `jaasConfigEntries` for callbacks which obtain some configs from the + JAAS configuration. For configs that may be specified as both Kafka config + as well as JAAS config (e.g. sasl.kerberos.service.name), the configuration + is treated as invalid if conflicting values are provided.
        +
        saslMechanism - Negotiated SASL mechanism. For clients, this is the SASL + mechanism configured for the client. For brokers, this is the mechanism + negotiated with the client and is one of the mechanisms enabled on the broker.
        +
        jaasConfigEntries - JAAS configuration entries from the JAAS login context. + This list contains a single entry for clients and may contain more than + one entry for brokers if multiple mechanisms are enabled on a listener using + static JAAS configuration where there is no mapping between mechanisms and + login module entries. In this case, callback handlers can use the login module in + `jaasConfigEntries` to identify the entry corresponding to `saslMechanism`. + Alternatively, dynamic JAAS configuration option + SaslConfigs.SASL_JAAS_CONFIG may be + configured on brokers with listener and mechanism prefix, in which case + only the configuration entry corresponding to `saslMechanism` will be provided + in `jaasConfigEntries`.
        +
        +
        +
      • +
      • +
        +

        close

        +
        public void close()
        +
        Description copied from interface: AuthenticateCallbackHandler
        +
        Closes this instance.
        +
        +
        Specified by:
        +
        close in interface AuthenticateCallbackHandler
        +
        +
        +
      • +
      • +
        +

        handle

        +
        public void handle(Callback[] callbacks) + throws IOException, +UnsupportedCallbackException
        +
        +
        Specified by:
        +
        handle in interface CallbackHandler
        +
        Throws:
        +
        IOException
        +
        UnsupportedCallbackException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerValidatorCallbackHandler.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerValidatorCallbackHandler.html new file mode 100644 index 000000000..ad6613e80 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/OAuthBearerValidatorCallbackHandler.html @@ -0,0 +1,288 @@ + + + + +OAuthBearerValidatorCallbackHandler (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class OAuthBearerValidatorCallbackHandler

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerValidatorCallbackHandler
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    CallbackHandler, AuthenticateCallbackHandler
    +
    +
    +
    public class OAuthBearerValidatorCallbackHandler +extends Object +implements AuthenticateCallbackHandler
    +

    + OAuthBearerValidatorCallbackHandler is an AuthenticateCallbackHandler that + accepts OAuthBearerValidatorCallback and OAuthBearerExtensionsValidatorCallback + callbacks to implement OAuth/OIDC validation. This callback handler is intended only to be used + on the Kafka broker side as it will receive a OAuthBearerValidatorCallback that includes + the JWT provided by the Kafka client. That JWT is validated in terms of format, expiration, + signature, and audience and issuer (if desired). This callback handler is the broker side of the + OAuth functionality, whereas OAuthBearerLoginCallbackHandler is used by clients. +

    + +

    + This AuthenticateCallbackHandler is enabled in the broker configuration by setting the + BrokerSecurityConfigs.SASL_SERVER_CALLBACK_HANDLER_CLASS + like so: + + + listener.name.<listenerName>.oauthbearer.sasl.server.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerValidatorCallbackHandler + +

    + +

    + The JAAS configuration for OAuth is also needed. If using OAuth for inter-broker communication, + the options are those specified in OAuthBearerLoginCallbackHandler. +

    + +

    + The configuration option + SaslConfigs.SASL_OAUTHBEARER_JWKS_ENDPOINT_URL + is also required in order to contact the OAuth/OIDC provider to retrieve the JWKS for use in + JWT signature validation. For example: + + + listener.name.<listenerName>.oauthbearer.sasl.oauthbearer.jwks.endpoint.url=https://example.com/oauth2/v1/keys + + + Please see the OAuth/OIDC providers documentation for the JWKS endpoint URL. +

    + +

    + The following is a list of all the configuration options that are available for the broker + validation callback handler: + +

    +

    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        OAuthBearerValidatorCallbackHandler

        +
        public OAuthBearerValidatorCallbackHandler()
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        configure

        +
        public void configure(Map<String,?> configs, + String saslMechanism, + List<AppConfigurationEntry> jaasConfigEntries)
        +
        Description copied from interface: AuthenticateCallbackHandler
        +
        Configures this callback handler for the specified SASL mechanism.
        +
        +
        Specified by:
        +
        configure in interface AuthenticateCallbackHandler
        +
        Parameters:
        +
        configs - Key-value pairs containing the parsed configuration options of + the client or broker. Note that these are the Kafka configuration options + and not the JAAS configuration options. JAAS config options may be obtained + from `jaasConfigEntries` for callbacks which obtain some configs from the + JAAS configuration. For configs that may be specified as both Kafka config + as well as JAAS config (e.g. sasl.kerberos.service.name), the configuration + is treated as invalid if conflicting values are provided.
        +
        saslMechanism - Negotiated SASL mechanism. For clients, this is the SASL + mechanism configured for the client. For brokers, this is the mechanism + negotiated with the client and is one of the mechanisms enabled on the broker.
        +
        jaasConfigEntries - JAAS configuration entries from the JAAS login context. + This list contains a single entry for clients and may contain more than + one entry for brokers if multiple mechanisms are enabled on a listener using + static JAAS configuration where there is no mapping between mechanisms and + login module entries. In this case, callback handlers can use the login module in + `jaasConfigEntries` to identify the entry corresponding to `saslMechanism`. + Alternatively, dynamic JAAS configuration option + SaslConfigs.SASL_JAAS_CONFIG may be + configured on brokers with listener and mechanism prefix, in which case + only the configuration entry corresponding to `saslMechanism` will be provided + in `jaasConfigEntries`.
        +
        +
        +
      • +
      • +
        +

        close

        +
        public void close()
        +
        Description copied from interface: AuthenticateCallbackHandler
        +
        Closes this instance.
        +
        +
        Specified by:
        +
        close in interface AuthenticateCallbackHandler
        +
        +
        +
      • +
      • +
        +

        handle

        +
        public void handle(Callback[] callbacks) + throws IOException, +UnsupportedCallbackException
        +
        +
        Specified by:
        +
        handle in interface CallbackHandler
        +
        Throws:
        +
        IOException
        +
        UnsupportedCallbackException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwks.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwks.html new file mode 100644 index 000000000..fe43202a7 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwks.html @@ -0,0 +1,307 @@ + + + + +RefreshingHttpsJwks (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class RefreshingHttpsJwks

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.RefreshingHttpsJwks
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Closeable, AutoCloseable, Initable
    +
    +
    +
    public final class RefreshingHttpsJwks +extends Object +implements Initable, Closeable
    +
    Implementation of HttpsJwks that will periodically refresh the JWKS cache to reduce or + even prevent HTTP/HTTPS traffic in the hot path of validation. It is assumed that it's + possible to receive a JWT that contains a kid that points to a yet-unknown JWK, + thus requiring a connection to the OAuth/OIDC provider to be made. Hopefully, in practice, + keys are made available for some amount of time before they're used within JWTs. + + This instance is created and provided to the + HttpsJwksVerificationKeyResolver that is used when using + an HTTP-/HTTPS-based VerificationKeyResolver, which is then + provided to the ValidatorAccessTokenValidator to use in validating the signature of + a JWT.
    +
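    The refresh-ahead idea described above (keep a cached key list fresh in the background so the validation hot path never performs I/O) can be sketched with plain JDK scheduling. This is an illustrative stand-in, not Kafka's implementation; `PeriodicCache` and its methods are hypothetical names:

```java
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

// Illustrative-only sketch: a background task refreshes a cached key list on a
// fixed period, so readers never block on network I/O (reads may be stale up to
// the refresh interval, as the RefreshingHttpsJwks docs note).
class PeriodicCache {
    private final AtomicReference<List<String>> cache = new AtomicReference<>(List.of());
    private final ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();

    void start(Supplier<List<String>> fetch, long refreshMs) {
        // Initial delay 0: populate the cache immediately, then refresh periodically.
        executor.scheduleAtFixedRate(() -> cache.set(fetch.get()), 0, refreshMs, TimeUnit.MILLISECONDS);
    }

    List<String> get() {
        return cache.get(); // non-blocking read of the last refreshed snapshot
    }

    void close() {
        executor.shutdownNow();
    }
}
```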
    +
    See Also:
    +
    + +
    +
    +
    +
    +
      + +
    • +
      +

      Constructor Summary

      +
      Constructors
      +
      +
      Constructor
      +
      Description
      +
      RefreshingHttpsJwks(org.apache.kafka.common.utils.Time time, + org.jose4j.jwk.HttpsJwks httpsJwks, + long refreshMs, + long refreshRetryBackoffMs, + long refreshRetryBackoffMaxMs)
      +
      +
      Creates a RefreshingHttpsJwks that will be used by the + RefreshingHttpsJwksVerificationKeyResolver to resolve new key IDs in JWTs.
      +
      +
      +
      +
    • + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      +
      void
      + +
       
      +
      List<org.jose4j.jwk.JsonWebKey>
      + +
      +
      Our implementation avoids the blocking call within HttpsJwks.refresh() that is + sometimes called internal to HttpsJwks.getJsonWebKeys().
      +
      + + +
       
      +
      void
      + +
      +
      Lifecycle method to perform any one-time initialization of the retriever.
      +
      +
      boolean
      + +
      +
      + maybeExpediteRefresh is a public method that will trigger a refresh of + the JWKS cache if all of the following conditions are met: + + + The given keyId parameter is no longer than the + MISSING_KEY_ID_MAX_KEY_LENGTH + The key isn't in the process of being expedited already +
      +
      +
      +
      +
      +
      +

      Methods inherited from class java.lang.Object

      +clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        RefreshingHttpsJwks

        +
        public RefreshingHttpsJwks(org.apache.kafka.common.utils.Time time, + org.jose4j.jwk.HttpsJwks httpsJwks, + long refreshMs, + long refreshRetryBackoffMs, + long refreshRetryBackoffMaxMs)
        +
        Creates a RefreshingHttpsJwks that will be used by the + RefreshingHttpsJwksVerificationKeyResolver to resolve new key IDs in JWTs.
        +
        +
        Parameters:
        +
        time - Time instance
        +
        httpsJwks - HttpsJwks instance from which to retrieve the JWKS + based on the OAuth/OIDC standard
        +
        refreshMs - The number of milliseconds between refresh passes to connect + to the OAuth/OIDC JWKS endpoint to retrieve the latest set
        +
        refreshRetryBackoffMs - Time for delay after initial failed attempt to retrieve JWKS
        +
        refreshRetryBackoffMaxMs - Maximum time for delay between failed attempts to retrieve JWKS
        +
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        init

        +
        public void init() + throws IOException
        +
        Description copied from interface: Initable
        +
        Lifecycle method to perform any one-time initialization of the retriever. This must + be performed by the caller to ensure the correct state before methods are invoked.
        +
        +
        Specified by:
        +
        init in interface Initable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during initialization
        +
        +
        +
      • +
      • +
        +

        close

        +
        public void close()
        +
        +
        Specified by:
        +
        close in interface AutoCloseable
        +
        Specified by:
        +
        close in interface Closeable
        +
        +
        +
      • +
      • +
        +

        getJsonWebKeys

        +
        public List<org.jose4j.jwk.JsonWebKey> getJsonWebKeys() + throws org.jose4j.lang.JoseException, +IOException
        +
        Our implementation avoids the blocking call within HttpsJwks.refresh() that is + sometimes called internal to HttpsJwks.getJsonWebKeys(). We want to avoid any + blocking I/O as this code is running in the authentication path on the Kafka network thread. + + The list may be stale up to refreshMs.
        +
        +
        Returns:
        +
        List of JsonWebKey instances
        +
        Throws:
        +
        org.jose4j.lang.JoseException - Thrown if a problem is encountered parsing the JSON content into JWKs
        +
        IOException - Thrown if a problem is encountered making the HTTP request
        +
        +
        +
      • +
      • +
        +

        getLocation

        +
        public String getLocation()
        +
        +
      • +
      • +
        +

        maybeExpediteRefresh

        +
        public boolean maybeExpediteRefresh(String keyId)
        +

        + maybeExpediteRefresh is a public method that will trigger a refresh of + the JWKS cache if all of the following conditions are met: + +

          +
        • The given keyId parameter is no longer than the + MISSING_KEY_ID_MAX_KEY_LENGTH
        • +
        • The key isn't in the process of being expedited already
        • +
        + +

        + This expedited refresh is scheduled immediately. +

        +
        +
        Parameters:
        +
        keyId - JWT key ID
        +
        Returns:
        +
        true if an expedited refresh was scheduled, false otherwise
        +
        +
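        The two guard conditions above amount to a bounded, de-duplicated trigger. A minimal sketch (class and method names hypothetical; the real method also schedules the refresh task):

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch of maybeExpediteRefresh's guard conditions: refuse
// oversized key IDs, and refuse keys whose expedited refresh is already pending.
class ExpediteGuard {
    static final int MISSING_KEY_ID_MAX_KEY_LENGTH = 1000; // assumed bound for illustration
    private final Set<String> inProgress = new HashSet<>();

    // Returns true only when a new expedited refresh was scheduled.
    synchronized boolean maybeExpediteRefresh(String keyId) {
        if (keyId.length() > MISSING_KEY_ID_MAX_KEY_LENGTH)
            return false;
        return inProgress.add(keyId); // add() returns false if already pending
    }

    synchronized void refreshCompleted(String keyId) {
        inProgress.remove(keyId);
    }
}
```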
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwksVerificationKeyResolver.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwksVerificationKeyResolver.html new file mode 100644 index 000000000..ef2e29167 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/RefreshingHttpsJwksVerificationKeyResolver.html @@ -0,0 +1,268 @@ + + + + +RefreshingHttpsJwksVerificationKeyResolver (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class RefreshingHttpsJwksVerificationKeyResolver

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.RefreshingHttpsJwksVerificationKeyResolver
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Closeable, AutoCloseable, CloseableVerificationKeyResolver, Initable, org.jose4j.keys.resolvers.VerificationKeyResolver
    +
    +
    +
    public class RefreshingHttpsJwksVerificationKeyResolver +extends Object +implements CloseableVerificationKeyResolver
    +
    RefreshingHttpsJwksVerificationKeyResolver is a + VerificationKeyResolver implementation that will periodically refresh the + JWKS using its HttpsJwks instance. + + A JWKS (JSON Web Key Set) + is a JSON document provided by the OAuth/OIDC provider that lists the keys used to sign the JWTs + it issues. + + Here is a sample JWKS JSON document: + +
    + {
    +   "keys": [
    +     {
    +       "kty": "RSA",
    +       "alg": "RS256",
    +       "kid": "abc123",
    +       "use": "sig",
    +       "e": "AQAB",
    +       "n": "..."
    +     },
    +     {
    +       "kty": "RSA",
    +       "alg": "RS256",
    +       "kid": "def456",
    +       "use": "sig",
    +       "e": "AQAB",
    +       "n": "..."
    +     }
    +   ]
    + }
    + 
    + + Without going into too much detail, the array of keys enumerates the key data that the provider + is using to sign the JWT. The key ID (kid) is referenced by the JWT's header in + order to match up the JWT's signing key with the key in the JWKS. During the validation step of + the broker, the jose4j OAuth library will use the contents of the appropriate key in the JWKS + to validate the signature. + + Given that the JWKS is referenced by the JWT, the JWKS must be made available by the + OAuth/OIDC provider so that a JWT can be validated.
    +
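    The kid matching described above reduces to a lookup keyed by the JWT header's key ID. A simplified stand-in (the real resolution is done by jose4j against full JWK entries; names here are illustrative):

```java
import java.util.Map;
import java.util.Optional;

// Toy JWKS lookup: kid -> key material. Real JWKs carry kty, alg, n, e, etc.;
// a String stands in for the key here purely for illustration.
class JwksKidLookup {
    static Optional<String> resolveKey(Map<String, String> jwks, String kid) {
        // The JWT header's kid selects which key in the set validates the signature.
        return Optional.ofNullable(jwks.get(kid));
    }
}
```

An unknown kid yields an empty result, which is exactly the case where RefreshingHttpsJwks may expedite a JWKS refresh.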
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        RefreshingHttpsJwksVerificationKeyResolver

        +
        public RefreshingHttpsJwksVerificationKeyResolver(RefreshingHttpsJwks refreshingHttpsJwks)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        init

        +
        public void init() + throws IOException
        +
        Description copied from interface: Initable
        +
        Lifecycle method to perform any one-time initialization of the retriever. This must + be performed by the caller to ensure the correct state before methods are invoked.
        +
        +
        Specified by:
        +
        init in interface Initable
        +
        Throws:
        +
        IOException - Thrown on errors related to IO during initialization
        +
        +
        +
      • +
      • +
        +

        close

        +
        public void close()
        +
        Description copied from interface: CloseableVerificationKeyResolver
        +
        Lifecycle method to perform a clean shutdown of the VerificationKeyResolver. + This must be performed by the caller to ensure the correct state, freeing up + and releasing any resources performed in Initable.init().
        +
        +
        Specified by:
        +
        close in interface AutoCloseable
        +
        Specified by:
        +
        close in interface Closeable
        +
        Specified by:
        +
        close in interface CloseableVerificationKeyResolver
        +
        +
        +
      • +
      • +
        +

        resolveKey

        +
        public Key resolveKey(org.jose4j.jws.JsonWebSignature jws, + List<org.jose4j.jwx.JsonWebStructure> nestingContext) + throws org.jose4j.lang.UnresolvableKeyException
        +
        +
        Specified by:
        +
        resolveKey in interface org.jose4j.keys.resolvers.VerificationKeyResolver
        +
        Throws:
        +
        org.jose4j.lang.UnresolvableKeyException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retry.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retry.html new file mode 100644 index 000000000..6951c759f --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retry.html @@ -0,0 +1,183 @@ + + + + +Retry (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + + +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.Retry<R>
    +
    +
    +
    +
    Type Parameters:
    +
    R - Result type
    +
    +
    +
    public class Retry<R> +extends Object
    +
    Retry encapsulates the mechanism to perform retries with exponential + backoff, using the provided wait times between attempts.
    +
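    The retry-with-exponential-backoff mechanism can be sketched as follows. This is a simplified, hypothetical version (unchecked exceptions instead of the real ExecutionException/UnretryableException pair, and the cap applied via the loop condition):

```java
// Illustrative sketch, not Kafka's Retry: double the delay after each failed
// attempt until the next delay would exceed the configured maximum.
class SimpleRetry<R> {
    interface Attempt<R> { R call(); } // throws an unchecked exception on failure

    private final long backoffMs;
    private final long backoffMaxMs;

    SimpleRetry(long backoffMs, long backoffMaxMs) {
        this.backoffMs = backoffMs;
        this.backoffMaxMs = backoffMaxMs;
    }

    R execute(Attempt<R> attempt) {
        long wait = backoffMs;
        RuntimeException last = null;
        while (wait <= backoffMaxMs) {
            try {
                return attempt.call();
            } catch (RuntimeException e) {
                last = e;
                try {
                    Thread.sleep(wait); // back off before the next attempt
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    break;
                }
                wait *= 2; // exponential backoff
            }
        }
        throw last != null ? last : new IllegalStateException("no attempt was made");
    }
}
```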
    +
    +
      + +
    • +
      +

      Constructor Summary

      +
      Constructors
      +
      +
      Constructor
      +
      Description
      +
      Retry(long retryBackoffMs, + long retryBackoffMaxMs)
      +
       
      +
      Retry(org.apache.kafka.common.utils.Time time, + long retryBackoffMs, + long retryBackoffMaxMs)
      +
       
      +
      +
      +
    • + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      + +
      execute(Retryable<R> retryable)
      +
       
      +
      +
      +
      +
      +

      Methods inherited from class java.lang.Object

      +clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        Retry

        +
        public Retry(long retryBackoffMs, + long retryBackoffMaxMs)
        +
        +
      • +
      • +
        +

        Retry

        +
        public Retry(org.apache.kafka.common.utils.Time time, + long retryBackoffMs, + long retryBackoffMaxMs)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      + +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retryable.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retryable.html new file mode 100644 index 000000000..c27f60d7e --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/Retryable.html @@ -0,0 +1,151 @@ + + + + +Retryable (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Interface Retryable<R>

    +
    +
    +
    +
    Type Parameters:
    +
    R - Result type
    +
    +
    +
    public interface Retryable<R>
    +
    Simple interface to abstract out the call that is made so that it can be retried.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      + + +
      +
      Perform the operation and return the data from the response.
      +
      +
      +
      +
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        call

        + +
        Perform the operation and return the data from the response.
        +
        +
        Returns:
        +
        Return response data, formatted in the given data type
        +
        Throws:
        +
        ExecutionException - Thrown on errors connecting, writing, reading, timeouts, etc. + that can likely be tried again
        +
        UnretryableException - Thrown on errors that we can determine should not be tried again
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/SerializedJwt.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/SerializedJwt.html new file mode 100644 index 000000000..5d98830c3 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/SerializedJwt.html @@ -0,0 +1,215 @@ + + + + +SerializedJwt (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + + +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.SerializedJwt
    +
    +
    +
    +
    public class SerializedJwt +extends Object
    +
    SerializedJwt provides a modicum of structure and validation around a JWT's serialized form by + splitting and making the three sections (header, payload, and signature) available to the user.
    +
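    The splitting that SerializedJwt performs can be sketched with a plain dot-delimited split plus a section-count check (class name and validation details here are illustrative, not the actual implementation):

```java
// Minimal sketch of splitting a serialized JWT ("header.payload.signature")
// into its three Base64-encoded sections, in the spirit of SerializedJwt.
class JwtSections {
    final String header;
    final String payload;
    final String signature;

    JwtSections(String token) {
        if (token == null)
            throw new IllegalArgumentException("empty JWT");
        String[] parts = token.trim().split("\\.");
        if (parts.length != 3)
            throw new IllegalArgumentException("JWT must have exactly three dot-separated sections");
        header = parts[0];
        payload = parts[1];
        signature = parts[2];
    }
}
```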
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        SerializedJwt

        +
        public SerializedJwt(String token)
        +
        +
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        getToken

        +
        public String getToken()
        +
        Returns the entire base 64-encoded JWT.
        +
        +
        Returns:
        +
        JWT
        +
        +
        +
      • +
      • +
        +

        getHeader

        +
        public String getHeader()
        +
        Returns the first section--the JWT header--in its base 64-encoded form.
        +
        +
        Returns:
        +
        Header section of the JWT
        +
        +
        +
      • +
      • +
        +

        getPayload

        +
        public String getPayload()
        +
        Returns the second section--the JWT payload--in its base 64-encoded form.
        +
        +
        Returns:
        +
        Payload section of the JWT
        +
        +
        +
      • +
      • +
        +

        getSignature

        +
        public String getSignature()
        +
        Returns the third section--the JWT signature--in its base 64-encoded form.
        +
        +
        Returns:
        +
        Signature section of the JWT
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/UnretryableException.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/UnretryableException.html new file mode 100644 index 000000000..9d18c1152 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/UnretryableException.html @@ -0,0 +1,168 @@ + + + + +UnretryableException (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class UnretryableException

    +
    +
    java.lang.Object +
    java.lang.Throwable +
    java.lang.Exception +
    java.lang.RuntimeException +
    org.apache.kafka.common.KafkaException +
    org.apache.kafka.common.security.oauthbearer.secured.UnretryableException
    +
    +
    +
    +
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Serializable
    +
    +
    +
    public class UnretryableException +extends KafkaException
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        UnretryableException

        +
        public UnretryableException(String message)
        +
        +
      • +
      • +
        +

        UnretryableException

        +
        public UnretryableException(Throwable cause)
        +
        +
      • +
      • +
        +

        UnretryableException

        +
        public UnretryableException(String message, + Throwable cause)
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidateException.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidateException.html new file mode 100644 index 000000000..1db923d26 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidateException.html @@ -0,0 +1,173 @@ + + + + +ValidateException (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class ValidateException

    +
    +
    java.lang.Object +
    java.lang.Throwable +
    java.lang.Exception +
    java.lang.RuntimeException +
    org.apache.kafka.common.KafkaException +
    org.apache.kafka.common.security.oauthbearer.secured.ValidateException
    +
    +
    +
    +
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    Serializable
    +
    +
    +
    public class ValidateException +extends KafkaException
    +
    ValidateException is thrown in cases where a JWT access token cannot be determined to be + valid for one reason or another. It is intended to be used when errors arise within the + processing of a CallbackHandler.handle(Callback[]). + This error, however, is not thrown from that method directly.
    +
    +
    See Also:
    +
    + +
    +
    +
    +
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        ValidateException

        +
        public ValidateException(String message)
        +
        +
      • +
      • +
        +

        ValidateException

        +
        public ValidateException(Throwable cause)
        +
        +
      • +
      • +
        +

        ValidateException

        +
        public ValidateException(String message, + Throwable cause)
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.ClaimSupplier.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.ClaimSupplier.html new file mode 100644 index 000000000..57b366d2d --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.ClaimSupplier.html @@ -0,0 +1,133 @@ + + + + +ValidatorAccessTokenValidator.ClaimSupplier (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Interface ValidatorAccessTokenValidator.ClaimSupplier<T>

    +
    +
    +
    +
    Enclosing class:
    +
    ValidatorAccessTokenValidator
    +
    +
    +
    public static interface ValidatorAccessTokenValidator.ClaimSupplier<T>
    +
    +
    +
      + +
    • +
      +

      Method Summary

      +
      +
      +
      +
      +
      Modifier and Type
      +
      Method
      +
      Description
      + +
      get()
      +
       
      +
      +
      +
      +
      +
    • +
    +
    +
    +
      + +
    • +
      +

      Method Details

      +
        +
      • +
        +

        get

        +
        T get() +throws org.jose4j.jwt.MalformedClaimException
        +
        +
        Throws:
        +
        org.jose4j.jwt.MalformedClaimException
        +
        +
        +
      • +
      +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.html new file mode 100644 index 000000000..eea173bcb --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/ValidatorAccessTokenValidator.html @@ -0,0 +1,272 @@ + + + + +ValidatorAccessTokenValidator (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class ValidatorAccessTokenValidator

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.ValidatorAccessTokenValidator
    +
    +
    +
    +
    All Implemented Interfaces:
    +
    AccessTokenValidator
    +
    +
    +
    public class ValidatorAccessTokenValidator +extends Object +implements AccessTokenValidator
    +
    ValidatorAccessTokenValidator is an implementation of AccessTokenValidator that is used + by the broker to perform more extensive validation of the JWT access token that is received + from the client, but ultimately from posting the client credentials to the OAuth/OIDC provider's + token endpoint. + + The validation steps performed (primarily by the jose4j library) are: +
      +
    1. Basic structural validation of the b64token value as defined in + RFC 6750 Section 2.1
    2. Basic conversion of the token into an in-memory data structure
    3. Presence of scope, exp, subject, iss, and + iat claims
    4. Signature matching validation against the kid and those provided by + the OAuth/OIDC provider's JWKS
    +
    +
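    Step 3 of the validation steps above (required-claim presence) can be sketched with a simple containment check. The validator delegates this to jose4j; the class below is a hypothetical illustration, using the standard claim name "sub" for the subject claim:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of the required-claim presence check: report which of the claims
// the docs list (scope, exp, subject/"sub", iss, iat) are absent.
class ClaimPresenceCheck {
    static List<String> missingClaims(Map<String, Object> claims) {
        return List.of("scope", "exp", "sub", "iss", "iat").stream()
                .filter(name -> !claims.containsKey(name))
                .collect(Collectors.toList());
    }
}
```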
    + +
    +
    +
      + +
    • +
      +

      Constructor Details

      +
        +
      • +
        +

        ValidatorAccessTokenValidator

        +
        public ValidatorAccessTokenValidator(Integer clockSkew, + Set<String> expectedAudiences, + String expectedIssuer, + org.jose4j.keys.resolvers.VerificationKeyResolver verificationKeyResolver, + String scopeClaimName, + String subClaimName)
        +
        Creates a new ValidatorAccessTokenValidator that will be used by the broker for more + thorough validation of the JWT.
        +
        +
        Parameters:
        +
        clockSkew - The optional value (in seconds) to allow for differences + between the time of the OAuth/OIDC identity provider and + the broker. If null is provided, the broker + and the OAuth/OIDC identity provider are assumed to have + very close clock settings.
        +
        expectedAudiences - The (optional) set the broker will use to verify that + the JWT was issued for one of the expected audiences. + The JWT will be inspected for the standard OAuth + aud claim and if this value is set, the + broker will match the value from JWT's aud + claim to see if there is an exact match. If there is no + match, the broker will reject the JWT and authentication + will fail. May be null to not perform any + check to verify the JWT's aud claim matches any + fixed set of known/expected audiences.
        +
        expectedIssuer - The (optional) value for the broker to use to verify that + the JWT was created by the expected issuer. The JWT will + be inspected for the standard OAuth iss claim + and if this value is set, the broker will match it + exactly against what is in the JWT's iss + claim. If there is no match, the broker will reject the JWT + and authentication will fail. May be null to not + perform any check to verify the JWT's iss claim + matches a specific issuer.
        +
        verificationKeyResolver - jose4j-based VerificationKeyResolver that is used + to validate the signature matches the contents of the header + and payload
        +
        scopeClaimName - Name of the scope claim to use; must be non-null
        +
        subClaimName - Name of the subject claim to use; must be + non-null
        +
        See Also:
        +
        +
          +
        • JwtConsumerBuilder
        • +
        • JwtConsumer
        • +
        • VerificationKeyResolver
        • +
        +
        +
        +
        +
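        The expectedAudiences behavior documented above (a null set disables the check; otherwise the JWT's aud claim must match one of the expected values exactly) can be sketched as follows; the class and method names are hypothetical:

```java
import java.util.Set;

// Sketch of the aud-claim check: null expectedAudiences disables the check,
// otherwise any exact match against the set passes.
class AudienceCheck {
    static boolean audienceAllowed(Set<String> expectedAudiences, String jwtAud) {
        if (expectedAudiences == null)
            return true; // no audience verification configured
        return expectedAudiences.contains(jwtAud);
    }
}
```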
      • +
      +
      +
    • + +
    • +
      +

      Method Details

      + +
      +
    • +
    +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/VerificationKeyResolverFactory.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/VerificationKeyResolverFactory.html new file mode 100644 index 000000000..b3538820d --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/VerificationKeyResolverFactory.html @@ -0,0 +1,189 @@ + + + + +VerificationKeyResolverFactory (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    + +
    + +

    Class VerificationKeyResolverFactory

    +
    +
    java.lang.Object +
    org.apache.kafka.common.security.oauthbearer.secured.VerificationKeyResolverFactory
    +
    +
    +
    +
    public class VerificationKeyResolverFactory +extends Object
    +
    +
    + +
    +
    + +
    + +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-summary.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-summary.html new file mode 100644 index 000000000..c00c5e9c0 --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-summary.html @@ -0,0 +1,222 @@ + + + + +org.apache.kafka.common.security.oauthbearer.secured (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    +
    +

    Package org.apache.kafka.common.security.oauthbearer.secured

    +
    +
    +
    package org.apache.kafka.common.security.oauthbearer.secured
    +
    + +
    +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-tree.html b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-tree.html new file mode 100644 index 000000000..1eceda94b --- /dev/null +++ b/31/javadoc/org/apache/kafka/common/security/oauthbearer/secured/package-tree.html @@ -0,0 +1,135 @@ + + + + +org.apache.kafka.common.security.oauthbearer.secured Class Hierarchy (kafka 3.1.1 API) + + + + + + + + + + + + + + +
    + +
    +
    +
    +

    Hierarchy For Package org.apache.kafka.common.security.oauthbearer.secured

    +Package Hierarchies: + +
    +
    +

    Class Hierarchy

    + +
    +
    +

    Interface Hierarchy

    + +
    +
    +
    +
    + + diff --git a/31/javadoc/org/apache/kafka/common/security/plain/PlainAuthenticateCallback.html b/31/javadoc/org/apache/kafka/common/security/plain/PlainAuthenticateCallback.html index 17663fe0a..980e944ee 100644 --- a/31/javadoc/org/apache/kafka/common/security/plain/PlainAuthenticateCallback.html +++ b/31/javadoc/org/apache/kafka/common/security/plain/PlainAuthenticateCallback.html @@ -2,7 +2,7 @@ -PlainAuthenticateCallback (kafka 3.1.0 API) +PlainAuthenticateCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/plain/PlainLoginModule.html b/31/javadoc/org/apache/kafka/common/security/plain/PlainLoginModule.html index 98c9f9250..fc6a23f13 100644 --- a/31/javadoc/org/apache/kafka/common/security/plain/PlainLoginModule.html +++ b/31/javadoc/org/apache/kafka/common/security/plain/PlainLoginModule.html @@ -2,7 +2,7 @@ -PlainLoginModule (kafka 3.1.0 API) +PlainLoginModule (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/plain/package-summary.html b/31/javadoc/org/apache/kafka/common/security/plain/package-summary.html index 9974d856f..d491b6a0d 100644 --- a/31/javadoc/org/apache/kafka/common/security/plain/package-summary.html +++ b/31/javadoc/org/apache/kafka/common/security/plain/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.plain (kafka 3.1.0 API) +org.apache.kafka.common.security.plain (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/plain/package-tree.html b/31/javadoc/org/apache/kafka/common/security/plain/package-tree.html index 2d07d7993..8e392be34 100644 --- a/31/javadoc/org/apache/kafka/common/security/plain/package-tree.html +++ b/31/javadoc/org/apache/kafka/common/security/plain/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.plain Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.common.security.plain Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/ScramCredential.html 
b/31/javadoc/org/apache/kafka/common/security/scram/ScramCredential.html index d9c818a37..dcffc5fdf 100644 --- a/31/javadoc/org/apache/kafka/common/security/scram/ScramCredential.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/ScramCredential.html @@ -2,7 +2,7 @@ -ScramCredential (kafka 3.1.0 API) +ScramCredential (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/ScramCredentialCallback.html b/31/javadoc/org/apache/kafka/common/security/scram/ScramCredentialCallback.html index 3ccb109e6..7196e24f2 100644 --- a/31/javadoc/org/apache/kafka/common/security/scram/ScramCredentialCallback.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/ScramCredentialCallback.html @@ -2,7 +2,7 @@ -ScramCredentialCallback (kafka 3.1.0 API) +ScramCredentialCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/ScramExtensionsCallback.html b/31/javadoc/org/apache/kafka/common/security/scram/ScramExtensionsCallback.html index 5d6ca5579..51ca768aa 100644 --- a/31/javadoc/org/apache/kafka/common/security/scram/ScramExtensionsCallback.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/ScramExtensionsCallback.html @@ -2,7 +2,7 @@ -ScramExtensionsCallback (kafka 3.1.0 API) +ScramExtensionsCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/ScramLoginModule.html b/31/javadoc/org/apache/kafka/common/security/scram/ScramLoginModule.html index b01d92afc..7f91ad3a6 100644 --- a/31/javadoc/org/apache/kafka/common/security/scram/ScramLoginModule.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/ScramLoginModule.html @@ -2,7 +2,7 @@ -ScramLoginModule (kafka 3.1.0 API) +ScramLoginModule (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/package-summary.html b/31/javadoc/org/apache/kafka/common/security/scram/package-summary.html index bbb959825..ea506ca85 100644 --- 
a/31/javadoc/org/apache/kafka/common/security/scram/package-summary.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.scram (kafka 3.1.0 API) +org.apache.kafka.common.security.scram (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/scram/package-tree.html b/31/javadoc/org/apache/kafka/common/security/scram/package-tree.html index 6e55a6368..8a5fed405 100644 --- a/31/javadoc/org/apache/kafka/common/security/scram/package-tree.html +++ b/31/javadoc/org/apache/kafka/common/security/scram/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.scram Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.common.security.scram Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/token/delegation/DelegationToken.html b/31/javadoc/org/apache/kafka/common/security/token/delegation/DelegationToken.html index 3a7c97cf7..df6fa2afd 100644 --- a/31/javadoc/org/apache/kafka/common/security/token/delegation/DelegationToken.html +++ b/31/javadoc/org/apache/kafka/common/security/token/delegation/DelegationToken.html @@ -2,7 +2,7 @@ -DelegationToken (kafka 3.1.0 API) +DelegationToken (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/token/delegation/TokenInformation.html b/31/javadoc/org/apache/kafka/common/security/token/delegation/TokenInformation.html index 39a106f7a..b078be6e5 100644 --- a/31/javadoc/org/apache/kafka/common/security/token/delegation/TokenInformation.html +++ b/31/javadoc/org/apache/kafka/common/security/token/delegation/TokenInformation.html @@ -2,7 +2,7 @@ -TokenInformation (kafka 3.1.0 API) +TokenInformation (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/token/delegation/package-summary.html b/31/javadoc/org/apache/kafka/common/security/token/delegation/package-summary.html index 1d28dd636..ffb4d4dc7 100644 --- 
a/31/javadoc/org/apache/kafka/common/security/token/delegation/package-summary.html +++ b/31/javadoc/org/apache/kafka/common/security/token/delegation/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.token.delegation (kafka 3.1.0 API) +org.apache.kafka.common.security.token.delegation (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/security/token/delegation/package-tree.html b/31/javadoc/org/apache/kafka/common/security/token/delegation/package-tree.html index 8eb0dd966..56f611a63 100644 --- a/31/javadoc/org/apache/kafka/common/security/token/delegation/package-tree.html +++ b/31/javadoc/org/apache/kafka/common/security/token/delegation/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.security.token.delegation Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.common.security.token.delegation Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ByteArrayDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/ByteArrayDeserializer.html index a96ff9273..16bda72ad 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ByteArrayDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ByteArrayDeserializer.html @@ -2,7 +2,7 @@ -ByteArrayDeserializer (kafka 3.1.0 API) +ByteArrayDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ByteArraySerializer.html b/31/javadoc/org/apache/kafka/common/serialization/ByteArraySerializer.html index 3310e8d05..0990a3c78 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ByteArraySerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ByteArraySerializer.html @@ -2,7 +2,7 @@ -ByteArraySerializer (kafka 3.1.0 API) +ByteArraySerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ByteBufferDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/ByteBufferDeserializer.html index d71d4ae25..e0b482d62 
100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ByteBufferDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ByteBufferDeserializer.html @@ -2,7 +2,7 @@ -ByteBufferDeserializer (kafka 3.1.0 API) +ByteBufferDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ByteBufferSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/ByteBufferSerializer.html index 3f935855d..2d297fd8a 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ByteBufferSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ByteBufferSerializer.html @@ -2,7 +2,7 @@ -ByteBufferSerializer (kafka 3.1.0 API) +ByteBufferSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/BytesDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/BytesDeserializer.html index f6bb7c822..9919216a8 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/BytesDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/BytesDeserializer.html @@ -2,7 +2,7 @@ -BytesDeserializer (kafka 3.1.0 API) +BytesDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/BytesSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/BytesSerializer.html index 0a961e818..d5f818f83 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/BytesSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/BytesSerializer.html @@ -2,7 +2,7 @@ -BytesSerializer (kafka 3.1.0 API) +BytesSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Deserializer.html b/31/javadoc/org/apache/kafka/common/serialization/Deserializer.html index ca6c08bd6..56a364389 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Deserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Deserializer.html @@ -2,7 +2,7 @@ -Deserializer (kafka 3.1.0 API) +Deserializer (kafka 3.1.1 
API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/DoubleDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/DoubleDeserializer.html index d3a50f726..330d3fc64 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/DoubleDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/DoubleDeserializer.html @@ -2,7 +2,7 @@ -DoubleDeserializer (kafka 3.1.0 API) +DoubleDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/DoubleSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/DoubleSerializer.html index b41d277c4..9585d4416 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/DoubleSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/DoubleSerializer.html @@ -2,7 +2,7 @@ -DoubleSerializer (kafka 3.1.0 API) +DoubleSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/FloatDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/FloatDeserializer.html index 74a96b7d5..7935e36c8 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/FloatDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/FloatDeserializer.html @@ -2,7 +2,7 @@ -FloatDeserializer (kafka 3.1.0 API) +FloatDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/FloatSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/FloatSerializer.html index 55ca84720..fd0c518ee 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/FloatSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/FloatSerializer.html @@ -2,7 +2,7 @@ -FloatSerializer (kafka 3.1.0 API) +FloatSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/IntegerDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/IntegerDeserializer.html index 14c057ee8..bf24c070b 100644 --- 
a/31/javadoc/org/apache/kafka/common/serialization/IntegerDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/IntegerDeserializer.html @@ -2,7 +2,7 @@ -IntegerDeserializer (kafka 3.1.0 API) +IntegerDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/IntegerSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/IntegerSerializer.html index 731e68ece..3baa8be08 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/IntegerSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/IntegerSerializer.html @@ -2,7 +2,7 @@ -IntegerSerializer (kafka 3.1.0 API) +IntegerSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ListDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/ListDeserializer.html index 7c58e89cf..6d94d11c6 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ListDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ListDeserializer.html @@ -2,7 +2,7 @@ -ListDeserializer (kafka 3.1.0 API) +ListDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ListSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/ListSerializer.html index b4630a8c7..82ad2761c 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ListSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ListSerializer.html @@ -2,7 +2,7 @@ -ListSerializer (kafka 3.1.0 API) +ListSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/LongDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/LongDeserializer.html index 7d8e8925f..d8b60792f 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/LongDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/LongDeserializer.html @@ -2,7 +2,7 @@ -LongDeserializer (kafka 3.1.0 API) +LongDeserializer (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/common/serialization/LongSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/LongSerializer.html index a4ad3ed5d..e1fd20753 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/LongSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/LongSerializer.html @@ -2,7 +2,7 @@ -LongSerializer (kafka 3.1.0 API) +LongSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serde.html b/31/javadoc/org/apache/kafka/common/serialization/Serde.html index edc31de4a..514b9d117 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serde.html @@ -2,7 +2,7 @@ -Serde (kafka 3.1.0 API) +Serde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteArraySerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteArraySerde.html index f12fe7e63..15af5ad88 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteArraySerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteArraySerde.html @@ -2,7 +2,7 @@ -Serdes.ByteArraySerde (kafka 3.1.0 API) +Serdes.ByteArraySerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteBufferSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteBufferSerde.html index 32c3b4a9a..df87c63bd 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteBufferSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ByteBufferSerde.html @@ -2,7 +2,7 @@ -Serdes.ByteBufferSerde (kafka 3.1.0 API) +Serdes.ByteBufferSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.BytesSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.BytesSerde.html index fd6be2d98..199a9b2df 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.BytesSerde.html +++ 
b/31/javadoc/org/apache/kafka/common/serialization/Serdes.BytesSerde.html @@ -2,7 +2,7 @@ -Serdes.BytesSerde (kafka 3.1.0 API) +Serdes.BytesSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.DoubleSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.DoubleSerde.html index 4431bc680..a7fab3fa2 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.DoubleSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.DoubleSerde.html @@ -2,7 +2,7 @@ -Serdes.DoubleSerde (kafka 3.1.0 API) +Serdes.DoubleSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.FloatSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.FloatSerde.html index 71a426f34..b9b93d75a 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.FloatSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.FloatSerde.html @@ -2,7 +2,7 @@ -Serdes.FloatSerde (kafka 3.1.0 API) +Serdes.FloatSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.IntegerSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.IntegerSerde.html index 42fb83316..04a6a352c 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.IntegerSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.IntegerSerde.html @@ -2,7 +2,7 @@ -Serdes.IntegerSerde (kafka 3.1.0 API) +Serdes.IntegerSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ListSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ListSerde.html index 7e7c6c162..d9607b0c6 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ListSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ListSerde.html @@ -2,7 +2,7 @@ -Serdes.ListSerde (kafka 3.1.0 API) +Serdes.ListSerde (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/common/serialization/Serdes.LongSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.LongSerde.html index a32078959..a1b5fc77e 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.LongSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.LongSerde.html @@ -2,7 +2,7 @@ -Serdes.LongSerde (kafka 3.1.0 API) +Serdes.LongSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ShortSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ShortSerde.html index d0cb6c30e..79c6f1d40 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.ShortSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.ShortSerde.html @@ -2,7 +2,7 @@ -Serdes.ShortSerde (kafka 3.1.0 API) +Serdes.ShortSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.StringSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.StringSerde.html index 08bd6e228..d1b36a0db 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.StringSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.StringSerde.html @@ -2,7 +2,7 @@ -Serdes.StringSerde (kafka 3.1.0 API) +Serdes.StringSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.UUIDSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.UUIDSerde.html index 9437ea7dc..8d1bfd9d9 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.UUIDSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.UUIDSerde.html @@ -2,7 +2,7 @@ -Serdes.UUIDSerde (kafka 3.1.0 API) +Serdes.UUIDSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.VoidSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.VoidSerde.html index 2f0b14756..512fbda55 100644 --- 
a/31/javadoc/org/apache/kafka/common/serialization/Serdes.VoidSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.VoidSerde.html @@ -2,7 +2,7 @@ -Serdes.VoidSerde (kafka 3.1.0 API) +Serdes.VoidSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.WrapperSerde.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.WrapperSerde.html index 73df2a196..256c6c995 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.WrapperSerde.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.WrapperSerde.html @@ -2,7 +2,7 @@ -Serdes.WrapperSerde (kafka 3.1.0 API) +Serdes.WrapperSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serdes.html b/31/javadoc/org/apache/kafka/common/serialization/Serdes.html index 9efae0c8a..3fea7e6b0 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serdes.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serdes.html @@ -2,7 +2,7 @@ -Serdes (kafka 3.1.0 API) +Serdes (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/Serializer.html b/31/javadoc/org/apache/kafka/common/serialization/Serializer.html index a424a13b2..2b3982cce 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/Serializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/Serializer.html @@ -2,7 +2,7 @@ -Serializer (kafka 3.1.0 API) +Serializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ShortDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/ShortDeserializer.html index a04dbbd23..5cc397e0e 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ShortDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ShortDeserializer.html @@ -2,7 +2,7 @@ -ShortDeserializer (kafka 3.1.0 API) +ShortDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/ShortSerializer.html 
b/31/javadoc/org/apache/kafka/common/serialization/ShortSerializer.html index 7236761bc..fd549f1bb 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/ShortSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/ShortSerializer.html @@ -2,7 +2,7 @@ -ShortSerializer (kafka 3.1.0 API) +ShortSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/StringDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/StringDeserializer.html index 9ba8c2575..1269969ab 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/StringDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/StringDeserializer.html @@ -2,7 +2,7 @@ -StringDeserializer (kafka 3.1.0 API) +StringDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/StringSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/StringSerializer.html index 2f5ce7295..10c2c6f45 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/StringSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/StringSerializer.html @@ -2,7 +2,7 @@ -StringSerializer (kafka 3.1.0 API) +StringSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/UUIDDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/UUIDDeserializer.html index be6eb7a35..e29da71cc 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/UUIDDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/UUIDDeserializer.html @@ -2,7 +2,7 @@ -UUIDDeserializer (kafka 3.1.0 API) +UUIDDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/UUIDSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/UUIDSerializer.html index 0c8b89eea..d5f69c5a7 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/UUIDSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/UUIDSerializer.html @@ -2,7 
+2,7 @@ -UUIDSerializer (kafka 3.1.0 API) +UUIDSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/VoidDeserializer.html b/31/javadoc/org/apache/kafka/common/serialization/VoidDeserializer.html index f8f36b0fd..0ac35ac2d 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/VoidDeserializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/VoidDeserializer.html @@ -2,7 +2,7 @@ -VoidDeserializer (kafka 3.1.0 API) +VoidDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/VoidSerializer.html b/31/javadoc/org/apache/kafka/common/serialization/VoidSerializer.html index 2983f7542..6ecc9f78e 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/VoidSerializer.html +++ b/31/javadoc/org/apache/kafka/common/serialization/VoidSerializer.html @@ -2,7 +2,7 @@ -VoidSerializer (kafka 3.1.0 API) +VoidSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/package-summary.html b/31/javadoc/org/apache/kafka/common/serialization/package-summary.html index 6d6d5c01d..f918fd7fd 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/package-summary.html +++ b/31/javadoc/org/apache/kafka/common/serialization/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.serialization (kafka 3.1.0 API) +org.apache.kafka.common.serialization (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/common/serialization/package-tree.html b/31/javadoc/org/apache/kafka/common/serialization/package-tree.html index 0b9d9000b..400e328ce 100644 --- a/31/javadoc/org/apache/kafka/common/serialization/package-tree.html +++ b/31/javadoc/org/apache/kafka/common/serialization/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.common.serialization Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.common.serialization Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/components/Versioned.html 
b/31/javadoc/org/apache/kafka/connect/components/Versioned.html index 721251d9f..49a195220 100644 --- a/31/javadoc/org/apache/kafka/connect/components/Versioned.html +++ b/31/javadoc/org/apache/kafka/connect/components/Versioned.html @@ -2,7 +2,7 @@ -Versioned (kafka 3.1.0 API) +Versioned (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/components/package-summary.html b/31/javadoc/org/apache/kafka/connect/components/package-summary.html index bbf286bda..503c3d0ff 100644 --- a/31/javadoc/org/apache/kafka/connect/components/package-summary.html +++ b/31/javadoc/org/apache/kafka/connect/components/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.components (kafka 3.1.0 API) +org.apache.kafka.connect.components (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/components/package-tree.html b/31/javadoc/org/apache/kafka/connect/components/package-tree.html index 53616868e..810cd5daa 100644 --- a/31/javadoc/org/apache/kafka/connect/components/package-tree.html +++ b/31/javadoc/org/apache/kafka/connect/components/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.components Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.connect.components Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/ConnectRecord.html b/31/javadoc/org/apache/kafka/connect/connector/ConnectRecord.html index 25ecbe2ae..2f981afe3 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/ConnectRecord.html +++ b/31/javadoc/org/apache/kafka/connect/connector/ConnectRecord.html @@ -2,7 +2,7 @@ -ConnectRecord (kafka 3.1.0 API) +ConnectRecord (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/Connector.html b/31/javadoc/org/apache/kafka/connect/connector/Connector.html index d4d127b29..36cfa4a64 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/Connector.html +++ b/31/javadoc/org/apache/kafka/connect/connector/Connector.html @@ -2,7 +2,7 @@ -Connector (kafka 3.1.0 API) 
+Connector (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/ConnectorContext.html b/31/javadoc/org/apache/kafka/connect/connector/ConnectorContext.html index 97cd5cdb6..0eb06cec9 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/ConnectorContext.html +++ b/31/javadoc/org/apache/kafka/connect/connector/ConnectorContext.html @@ -2,7 +2,7 @@ -ConnectorContext (kafka 3.1.0 API) +ConnectorContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/Task.html b/31/javadoc/org/apache/kafka/connect/connector/Task.html index e923434cb..a2b1d7b09 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/Task.html +++ b/31/javadoc/org/apache/kafka/connect/connector/Task.html @@ -2,7 +2,7 @@ -Task (kafka 3.1.0 API) +Task (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/package-summary.html b/31/javadoc/org/apache/kafka/connect/connector/package-summary.html index 14e462b26..570f49ef5 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/package-summary.html +++ b/31/javadoc/org/apache/kafka/connect/connector/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.connector (kafka 3.1.0 API) +org.apache.kafka.connect.connector (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/package-tree.html b/31/javadoc/org/apache/kafka/connect/connector/package-tree.html index 51a9e973e..05d839f90 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/package-tree.html +++ b/31/javadoc/org/apache/kafka/connect/connector/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.connector Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.connect.connector Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigOverridePolicy.html b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigOverridePolicy.html index 97d5aca15..2ddf95488 100644 --- 
a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigOverridePolicy.html +++ b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigOverridePolicy.html @@ -2,7 +2,7 @@ -ConnectorClientConfigOverridePolicy (kafka 3.1.0 API) +ConnectorClientConfigOverridePolicy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.ClientType.html b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.ClientType.html index 488f3aafd..825345b9c 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.ClientType.html +++ b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.ClientType.html @@ -2,7 +2,7 @@ -ConnectorClientConfigRequest.ClientType (kafka 3.1.0 API) +ConnectorClientConfigRequest.ClientType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.html b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.html index 50a204e36..63db16fab 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.html +++ b/31/javadoc/org/apache/kafka/connect/connector/policy/ConnectorClientConfigRequest.html @@ -2,7 +2,7 @@ -ConnectorClientConfigRequest (kafka 3.1.0 API) +ConnectorClientConfigRequest (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/connector/policy/package-summary.html b/31/javadoc/org/apache/kafka/connect/connector/policy/package-summary.html index c5801fdb0..6c904bca6 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/policy/package-summary.html +++ b/31/javadoc/org/apache/kafka/connect/connector/policy/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.connector.policy (kafka 3.1.0 API) +org.apache.kafka.connect.connector.policy (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/connect/connector/policy/package-tree.html b/31/javadoc/org/apache/kafka/connect/connector/policy/package-tree.html index b876bb281..1414e6e1a 100644 --- a/31/javadoc/org/apache/kafka/connect/connector/policy/package-tree.html +++ b/31/javadoc/org/apache/kafka/connect/connector/policy/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.connect.connector.policy Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.connect.connector.policy Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/ConnectSchema.html b/31/javadoc/org/apache/kafka/connect/data/ConnectSchema.html index 9ec760d68..8a037571b 100644 --- a/31/javadoc/org/apache/kafka/connect/data/ConnectSchema.html +++ b/31/javadoc/org/apache/kafka/connect/data/ConnectSchema.html @@ -2,7 +2,7 @@ -ConnectSchema (kafka 3.1.0 API) +ConnectSchema (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Date.html b/31/javadoc/org/apache/kafka/connect/data/Date.html index cad322c92..12d352e37 100644 --- a/31/javadoc/org/apache/kafka/connect/data/Date.html +++ b/31/javadoc/org/apache/kafka/connect/data/Date.html @@ -2,7 +2,7 @@ -Date (kafka 3.1.0 API) +Date (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Decimal.html b/31/javadoc/org/apache/kafka/connect/data/Decimal.html index 1c4809802..b8883ddbc 100644 --- a/31/javadoc/org/apache/kafka/connect/data/Decimal.html +++ b/31/javadoc/org/apache/kafka/connect/data/Decimal.html @@ -2,7 +2,7 @@ -Decimal (kafka 3.1.0 API) +Decimal (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Field.html b/31/javadoc/org/apache/kafka/connect/data/Field.html index 8de32a157..5605e22fd 100644 --- a/31/javadoc/org/apache/kafka/connect/data/Field.html +++ b/31/javadoc/org/apache/kafka/connect/data/Field.html @@ -2,7 +2,7 @@ -Field (kafka 3.1.0 API) +Field (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Schema.Type.html 
b/31/javadoc/org/apache/kafka/connect/data/Schema.Type.html index 894090cd6..f27daddc5 100644 --- a/31/javadoc/org/apache/kafka/connect/data/Schema.Type.html +++ b/31/javadoc/org/apache/kafka/connect/data/Schema.Type.html @@ -2,7 +2,7 @@ -Schema.Type (kafka 3.1.0 API) +Schema.Type (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Schema.html b/31/javadoc/org/apache/kafka/connect/data/Schema.html index e8d9a7499..82de3f3a1 100644 --- a/31/javadoc/org/apache/kafka/connect/data/Schema.html +++ b/31/javadoc/org/apache/kafka/connect/data/Schema.html @@ -2,7 +2,7 @@ -Schema (kafka 3.1.0 API) +Schema (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/SchemaAndValue.html b/31/javadoc/org/apache/kafka/connect/data/SchemaAndValue.html index dcb0237fc..d81ea8458 100644 --- a/31/javadoc/org/apache/kafka/connect/data/SchemaAndValue.html +++ b/31/javadoc/org/apache/kafka/connect/data/SchemaAndValue.html @@ -2,7 +2,7 @@ -SchemaAndValue (kafka 3.1.0 API) +SchemaAndValue (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/SchemaBuilder.html b/31/javadoc/org/apache/kafka/connect/data/SchemaBuilder.html index 70a6e4450..11a168ea7 100644 --- a/31/javadoc/org/apache/kafka/connect/data/SchemaBuilder.html +++ b/31/javadoc/org/apache/kafka/connect/data/SchemaBuilder.html @@ -2,7 +2,7 @@ -SchemaBuilder (kafka 3.1.0 API) +SchemaBuilder (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/SchemaProjector.html b/31/javadoc/org/apache/kafka/connect/data/SchemaProjector.html index 354abab41..be75f30ba 100644 --- a/31/javadoc/org/apache/kafka/connect/data/SchemaProjector.html +++ b/31/javadoc/org/apache/kafka/connect/data/SchemaProjector.html @@ -2,7 +2,7 @@ -SchemaProjector (kafka 3.1.0 API) +SchemaProjector (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/connect/data/Struct.html b/31/javadoc/org/apache/kafka/connect/data/Struct.html index 8023b9019..123dcbcd4 100644 --- 
a/31/javadoc/org/apache/kafka/connect/data/Struct.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Struct.html
@@ -2,7 +2,7 @@
-Struct (kafka 3.1.0 API)
+Struct (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/Time.html b/31/javadoc/org/apache/kafka/connect/data/Time.html
index 850621f9a..d162d5604 100644
--- a/31/javadoc/org/apache/kafka/connect/data/Time.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Time.html
@@ -2,7 +2,7 @@
-Time (kafka 3.1.0 API)
+Time (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/Timestamp.html b/31/javadoc/org/apache/kafka/connect/data/Timestamp.html
index d2ef8acaa..d6e017d7b 100644
--- a/31/javadoc/org/apache/kafka/connect/data/Timestamp.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Timestamp.html
@@ -2,7 +2,7 @@
-Timestamp (kafka 3.1.0 API)
+Timestamp (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/Values.Parser.html b/31/javadoc/org/apache/kafka/connect/data/Values.Parser.html
index 72d90831e..08dba1a29 100644
--- a/31/javadoc/org/apache/kafka/connect/data/Values.Parser.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Values.Parser.html
@@ -2,7 +2,7 @@
-Values.Parser (kafka 3.1.0 API)
+Values.Parser (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/Values.SchemaDetector.html b/31/javadoc/org/apache/kafka/connect/data/Values.SchemaDetector.html
index baa067fd8..90f285658 100644
--- a/31/javadoc/org/apache/kafka/connect/data/Values.SchemaDetector.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Values.SchemaDetector.html
@@ -2,7 +2,7 @@
-Values.SchemaDetector (kafka 3.1.0 API)
+Values.SchemaDetector (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/Values.html b/31/javadoc/org/apache/kafka/connect/data/Values.html
index 5c0e39462..7afae6b8c 100644
--- a/31/javadoc/org/apache/kafka/connect/data/Values.html
+++ b/31/javadoc/org/apache/kafka/connect/data/Values.html
@@ -2,7 +2,7 @@
-Values (kafka 3.1.0 API)
+Values (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/package-summary.html b/31/javadoc/org/apache/kafka/connect/data/package-summary.html
index 25156d8c4..8612e599b 100644
--- a/31/javadoc/org/apache/kafka/connect/data/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/data/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.data (kafka 3.1.0 API)
+org.apache.kafka.connect.data (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/data/package-tree.html b/31/javadoc/org/apache/kafka/connect/data/package-tree.html
index 0aebba460..1e4b2ed12 100644
--- a/31/javadoc/org/apache/kafka/connect/data/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/data/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.data Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.data Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/AlreadyExistsException.html b/31/javadoc/org/apache/kafka/connect/errors/AlreadyExistsException.html
index 6a7968295..bfa650ffb 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/AlreadyExistsException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/AlreadyExistsException.html
@@ -2,7 +2,7 @@
-AlreadyExistsException (kafka 3.1.0 API)
+AlreadyExistsException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/ConnectException.html b/31/javadoc/org/apache/kafka/connect/errors/ConnectException.html
index b9a407651..b8104c4ed 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/ConnectException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/ConnectException.html
@@ -2,7 +2,7 @@
-ConnectException (kafka 3.1.0 API)
+ConnectException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/DataException.html b/31/javadoc/org/apache/kafka/connect/errors/DataException.html
index 476ce6948..98451cf2a 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/DataException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/DataException.html
@@ -2,7 +2,7 @@
-DataException (kafka 3.1.0 API)
+DataException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/IllegalWorkerStateException.html b/31/javadoc/org/apache/kafka/connect/errors/IllegalWorkerStateException.html
index add195757..41f393bae 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/IllegalWorkerStateException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/IllegalWorkerStateException.html
@@ -2,7 +2,7 @@
-IllegalWorkerStateException (kafka 3.1.0 API)
+IllegalWorkerStateException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/NotFoundException.html b/31/javadoc/org/apache/kafka/connect/errors/NotFoundException.html
index bca1d120c..178f588de 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/NotFoundException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/NotFoundException.html
@@ -2,7 +2,7 @@
-NotFoundException (kafka 3.1.0 API)
+NotFoundException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/RetriableException.html b/31/javadoc/org/apache/kafka/connect/errors/RetriableException.html
index a0e1072b6..862989ba7 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/RetriableException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/RetriableException.html
@@ -2,7 +2,7 @@
-RetriableException (kafka 3.1.0 API)
+RetriableException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/SchemaBuilderException.html b/31/javadoc/org/apache/kafka/connect/errors/SchemaBuilderException.html
index 0f152cb7c..316a08f4b 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/SchemaBuilderException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/SchemaBuilderException.html
@@ -2,7 +2,7 @@
-SchemaBuilderException (kafka 3.1.0 API)
+SchemaBuilderException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/SchemaProjectorException.html b/31/javadoc/org/apache/kafka/connect/errors/SchemaProjectorException.html
index 95eeadc57..8c56928e4 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/SchemaProjectorException.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/SchemaProjectorException.html
@@ -2,7 +2,7 @@
-SchemaProjectorException (kafka 3.1.0 API)
+SchemaProjectorException (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/package-summary.html b/31/javadoc/org/apache/kafka/connect/errors/package-summary.html
index 7347e5383..b6af33f01 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.errors (kafka 3.1.0 API)
+org.apache.kafka.connect.errors (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/errors/package-tree.html b/31/javadoc/org/apache/kafka/connect/errors/package-tree.html
index 00e29b1cf..d2499ee39 100644
--- a/31/javadoc/org/apache/kafka/connect/errors/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/errors/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.errors Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.errors Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/ConnectHeaders.html b/31/javadoc/org/apache/kafka/connect/header/ConnectHeaders.html
index 5ef22f7c9..705da7fd6 100644
--- a/31/javadoc/org/apache/kafka/connect/header/ConnectHeaders.html
+++ b/31/javadoc/org/apache/kafka/connect/header/ConnectHeaders.html
@@ -2,7 +2,7 @@
-ConnectHeaders (kafka 3.1.0 API)
+ConnectHeaders (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/Header.html b/31/javadoc/org/apache/kafka/connect/header/Header.html
index 1bf6fcc84..296455c16 100644
--- a/31/javadoc/org/apache/kafka/connect/header/Header.html
+++ b/31/javadoc/org/apache/kafka/connect/header/Header.html
@@ -2,7 +2,7 @@
-Header (kafka 3.1.0 API)
+Header (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/Headers.HeaderTransform.html b/31/javadoc/org/apache/kafka/connect/header/Headers.HeaderTransform.html
index 5b28b509d..8c7b7ba69 100644
--- a/31/javadoc/org/apache/kafka/connect/header/Headers.HeaderTransform.html
+++ b/31/javadoc/org/apache/kafka/connect/header/Headers.HeaderTransform.html
@@ -2,7 +2,7 @@
-Headers.HeaderTransform (kafka 3.1.0 API)
+Headers.HeaderTransform (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/Headers.html b/31/javadoc/org/apache/kafka/connect/header/Headers.html
index 55a13cd96..7d62cbd47 100644
--- a/31/javadoc/org/apache/kafka/connect/header/Headers.html
+++ b/31/javadoc/org/apache/kafka/connect/header/Headers.html
@@ -2,7 +2,7 @@
-Headers (kafka 3.1.0 API)
+Headers (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/package-summary.html b/31/javadoc/org/apache/kafka/connect/header/package-summary.html
index ced1754a8..270415ea5 100644
--- a/31/javadoc/org/apache/kafka/connect/header/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/header/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.header (kafka 3.1.0 API)
+org.apache.kafka.connect.header (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/header/package-tree.html b/31/javadoc/org/apache/kafka/connect/header/package-tree.html
index 63f3c609a..2d99d4e0e 100644
--- a/31/javadoc/org/apache/kafka/connect/header/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/header/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.header Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.header Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/AbstractState.html b/31/javadoc/org/apache/kafka/connect/health/AbstractState.html
index 590454f90..5c08fa15a 100644
--- a/31/javadoc/org/apache/kafka/connect/health/AbstractState.html
+++ b/31/javadoc/org/apache/kafka/connect/health/AbstractState.html
@@ -2,7 +2,7 @@
-AbstractState (kafka 3.1.0 API)
+AbstractState (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/ConnectClusterDetails.html b/31/javadoc/org/apache/kafka/connect/health/ConnectClusterDetails.html
index a58da8c07..6ec795379 100644
--- a/31/javadoc/org/apache/kafka/connect/health/ConnectClusterDetails.html
+++ b/31/javadoc/org/apache/kafka/connect/health/ConnectClusterDetails.html
@@ -2,7 +2,7 @@
-ConnectClusterDetails (kafka 3.1.0 API)
+ConnectClusterDetails (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/ConnectClusterState.html b/31/javadoc/org/apache/kafka/connect/health/ConnectClusterState.html
index 771cb071c..ffd567940 100644
--- a/31/javadoc/org/apache/kafka/connect/health/ConnectClusterState.html
+++ b/31/javadoc/org/apache/kafka/connect/health/ConnectClusterState.html
@@ -2,7 +2,7 @@
-ConnectClusterState (kafka 3.1.0 API)
+ConnectClusterState (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/ConnectorHealth.html b/31/javadoc/org/apache/kafka/connect/health/ConnectorHealth.html
index 9940d08be..f29704b59 100644
--- a/31/javadoc/org/apache/kafka/connect/health/ConnectorHealth.html
+++ b/31/javadoc/org/apache/kafka/connect/health/ConnectorHealth.html
@@ -2,7 +2,7 @@
-ConnectorHealth (kafka 3.1.0 API)
+ConnectorHealth (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/ConnectorState.html b/31/javadoc/org/apache/kafka/connect/health/ConnectorState.html
index 187f4ed8f..f69de5ab3 100644
--- a/31/javadoc/org/apache/kafka/connect/health/ConnectorState.html
+++ b/31/javadoc/org/apache/kafka/connect/health/ConnectorState.html
@@ -2,7 +2,7 @@
-ConnectorState (kafka 3.1.0 API)
+ConnectorState (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/ConnectorType.html b/31/javadoc/org/apache/kafka/connect/health/ConnectorType.html
index f1abddb35..8c73b8b71 100644
--- a/31/javadoc/org/apache/kafka/connect/health/ConnectorType.html
+++ b/31/javadoc/org/apache/kafka/connect/health/ConnectorType.html
@@ -2,7 +2,7 @@
-ConnectorType (kafka 3.1.0 API)
+ConnectorType (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/TaskState.html b/31/javadoc/org/apache/kafka/connect/health/TaskState.html
index 4435a9391..714c0f87d 100644
--- a/31/javadoc/org/apache/kafka/connect/health/TaskState.html
+++ b/31/javadoc/org/apache/kafka/connect/health/TaskState.html
@@ -2,7 +2,7 @@
-TaskState (kafka 3.1.0 API)
+TaskState (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/package-summary.html b/31/javadoc/org/apache/kafka/connect/health/package-summary.html
index c7518e6d8..88f836479 100644
--- a/31/javadoc/org/apache/kafka/connect/health/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/health/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.health (kafka 3.1.0 API)
+org.apache.kafka.connect.health (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/health/package-tree.html b/31/javadoc/org/apache/kafka/connect/health/package-tree.html
index 4ad7ef2d7..4b15659ef 100644
--- a/31/javadoc/org/apache/kafka/connect/health/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/health/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.health Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.health Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/Checkpoint.html b/31/javadoc/org/apache/kafka/connect/mirror/Checkpoint.html
index da46c4dfb..11443375d 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/Checkpoint.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/Checkpoint.html
@@ -2,7 +2,7 @@
-Checkpoint (kafka 3.1.0 API)
+Checkpoint (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/DefaultReplicationPolicy.html b/31/javadoc/org/apache/kafka/connect/mirror/DefaultReplicationPolicy.html
index 67f26b1fb..7c9a0ddeb 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/DefaultReplicationPolicy.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/DefaultReplicationPolicy.html
@@ -2,7 +2,7 @@
-DefaultReplicationPolicy (kafka 3.1.0 API)
+DefaultReplicationPolicy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/Heartbeat.html b/31/javadoc/org/apache/kafka/connect/mirror/Heartbeat.html
index 168f892e2..4008a91fe 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/Heartbeat.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/Heartbeat.html
@@ -2,7 +2,7 @@
-Heartbeat (kafka 3.1.0 API)
+Heartbeat (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/IdentityReplicationPolicy.html b/31/javadoc/org/apache/kafka/connect/mirror/IdentityReplicationPolicy.html
index 516f419be..a66f36384 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/IdentityReplicationPolicy.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/IdentityReplicationPolicy.html
@@ -2,7 +2,7 @@
-IdentityReplicationPolicy (kafka 3.1.0 API)
+IdentityReplicationPolicy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/MirrorClient.html b/31/javadoc/org/apache/kafka/connect/mirror/MirrorClient.html
index f14635ce5..852eda532 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/MirrorClient.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/MirrorClient.html
@@ -2,7 +2,7 @@
-MirrorClient (kafka 3.1.0 API)
+MirrorClient (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/MirrorClientConfig.html b/31/javadoc/org/apache/kafka/connect/mirror/MirrorClientConfig.html
index f4ff030f3..10401ed77 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/MirrorClientConfig.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/MirrorClientConfig.html
@@ -2,7 +2,7 @@
-MirrorClientConfig (kafka 3.1.0 API)
+MirrorClientConfig (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/RemoteClusterUtils.html b/31/javadoc/org/apache/kafka/connect/mirror/RemoteClusterUtils.html
index 56ddf38cc..f5d408bb9 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/RemoteClusterUtils.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/RemoteClusterUtils.html
@@ -2,7 +2,7 @@
-RemoteClusterUtils (kafka 3.1.0 API)
+RemoteClusterUtils (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/ReplicationPolicy.html b/31/javadoc/org/apache/kafka/connect/mirror/ReplicationPolicy.html
index 0208546fa..b53cbee93 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/ReplicationPolicy.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/ReplicationPolicy.html
@@ -2,7 +2,7 @@
-ReplicationPolicy (kafka 3.1.0 API)
+ReplicationPolicy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/SourceAndTarget.html b/31/javadoc/org/apache/kafka/connect/mirror/SourceAndTarget.html
index 14bb39986..7937e63ed 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/SourceAndTarget.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/SourceAndTarget.html
@@ -2,7 +2,7 @@
-SourceAndTarget (kafka 3.1.0 API)
+SourceAndTarget (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/package-summary.html b/31/javadoc/org/apache/kafka/connect/mirror/package-summary.html
index 45191963a..926b68f36 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.mirror (kafka 3.1.0 API)
+org.apache.kafka.connect.mirror (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/mirror/package-tree.html b/31/javadoc/org/apache/kafka/connect/mirror/package-tree.html
index 4daa2c52a..eea3d99d4 100644
--- a/31/javadoc/org/apache/kafka/connect/mirror/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/mirror/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.mirror Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.mirror Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtension.html b/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtension.html
index 306f71ea4..ffce74456 100644
--- a/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtension.html
+++ b/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtension.html
@@ -2,7 +2,7 @@
-ConnectRestExtension (kafka 3.1.0 API)
+ConnectRestExtension (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtensionContext.html b/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtensionContext.html
index 7e86e55f1..0135c4d5d 100644
--- a/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtensionContext.html
+++ b/31/javadoc/org/apache/kafka/connect/rest/ConnectRestExtensionContext.html
@@ -2,7 +2,7 @@
-ConnectRestExtensionContext (kafka 3.1.0 API)
+ConnectRestExtensionContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/rest/package-summary.html b/31/javadoc/org/apache/kafka/connect/rest/package-summary.html
index 4bbc1dc9f..a58a786b4 100644
--- a/31/javadoc/org/apache/kafka/connect/rest/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/rest/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.rest (kafka 3.1.0 API)
+org.apache.kafka.connect.rest (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/rest/package-tree.html b/31/javadoc/org/apache/kafka/connect/rest/package-tree.html
index c19304e87..f5fd4d50c 100644
--- a/31/javadoc/org/apache/kafka/connect/rest/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/rest/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.rest Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.rest Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/ErrantRecordReporter.html b/31/javadoc/org/apache/kafka/connect/sink/ErrantRecordReporter.html
index f11bb8e68..4326c5d45 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/ErrantRecordReporter.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/ErrantRecordReporter.html
@@ -2,7 +2,7 @@
-ErrantRecordReporter (kafka 3.1.0 API)
+ErrantRecordReporter (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/SinkConnector.html b/31/javadoc/org/apache/kafka/connect/sink/SinkConnector.html
index b7f9e7844..d3e2b9d1e 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/SinkConnector.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/SinkConnector.html
@@ -2,7 +2,7 @@
-SinkConnector (kafka 3.1.0 API)
+SinkConnector (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/SinkConnectorContext.html b/31/javadoc/org/apache/kafka/connect/sink/SinkConnectorContext.html
index 0e4b2a0af..b4fa61651 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/SinkConnectorContext.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/SinkConnectorContext.html
@@ -2,7 +2,7 @@
-SinkConnectorContext (kafka 3.1.0 API)
+SinkConnectorContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/SinkRecord.html b/31/javadoc/org/apache/kafka/connect/sink/SinkRecord.html
index 0487295ce..bf09b4f89 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/SinkRecord.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/SinkRecord.html
@@ -2,7 +2,7 @@
-SinkRecord (kafka 3.1.0 API)
+SinkRecord (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/SinkTask.html b/31/javadoc/org/apache/kafka/connect/sink/SinkTask.html
index 3dd7d37b3..739564f86 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/SinkTask.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/SinkTask.html
@@ -2,7 +2,7 @@
-SinkTask (kafka 3.1.0 API)
+SinkTask (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/SinkTaskContext.html b/31/javadoc/org/apache/kafka/connect/sink/SinkTaskContext.html
index d98551727..a09c7ed8d 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/SinkTaskContext.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/SinkTaskContext.html
@@ -2,7 +2,7 @@
-SinkTaskContext (kafka 3.1.0 API)
+SinkTaskContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/package-summary.html b/31/javadoc/org/apache/kafka/connect/sink/package-summary.html
index 65b92771b..0e9cf373a 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.sink (kafka 3.1.0 API)
+org.apache.kafka.connect.sink (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/sink/package-tree.html b/31/javadoc/org/apache/kafka/connect/sink/package-tree.html
index 7e606922e..e345e0b5f 100644
--- a/31/javadoc/org/apache/kafka/connect/sink/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/sink/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.sink Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.sink Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/SourceConnector.html b/31/javadoc/org/apache/kafka/connect/source/SourceConnector.html
index da9afbe1f..4fb1a4600 100644
--- a/31/javadoc/org/apache/kafka/connect/source/SourceConnector.html
+++ b/31/javadoc/org/apache/kafka/connect/source/SourceConnector.html
@@ -2,7 +2,7 @@
-SourceConnector (kafka 3.1.0 API)
+SourceConnector (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/SourceConnectorContext.html b/31/javadoc/org/apache/kafka/connect/source/SourceConnectorContext.html
index 998bf2686..05cb4bba9 100644
--- a/31/javadoc/org/apache/kafka/connect/source/SourceConnectorContext.html
+++ b/31/javadoc/org/apache/kafka/connect/source/SourceConnectorContext.html
@@ -2,7 +2,7 @@
-SourceConnectorContext (kafka 3.1.0 API)
+SourceConnectorContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/SourceRecord.html b/31/javadoc/org/apache/kafka/connect/source/SourceRecord.html
index 55c0a6528..14898ead7 100644
--- a/31/javadoc/org/apache/kafka/connect/source/SourceRecord.html
+++ b/31/javadoc/org/apache/kafka/connect/source/SourceRecord.html
@@ -2,7 +2,7 @@
-SourceRecord (kafka 3.1.0 API)
+SourceRecord (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/SourceTask.html b/31/javadoc/org/apache/kafka/connect/source/SourceTask.html
index 2658b6c04..5d2ae3773 100644
--- a/31/javadoc/org/apache/kafka/connect/source/SourceTask.html
+++ b/31/javadoc/org/apache/kafka/connect/source/SourceTask.html
@@ -2,7 +2,7 @@
-SourceTask (kafka 3.1.0 API)
+SourceTask (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/SourceTaskContext.html b/31/javadoc/org/apache/kafka/connect/source/SourceTaskContext.html
index 0ca360067..8869bd206 100644
--- a/31/javadoc/org/apache/kafka/connect/source/SourceTaskContext.html
+++ b/31/javadoc/org/apache/kafka/connect/source/SourceTaskContext.html
@@ -2,7 +2,7 @@
-SourceTaskContext (kafka 3.1.0 API)
+SourceTaskContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/package-summary.html b/31/javadoc/org/apache/kafka/connect/source/package-summary.html
index d17c6b7ac..7c0d22bde 100644
--- a/31/javadoc/org/apache/kafka/connect/source/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/source/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.source (kafka 3.1.0 API)
+org.apache.kafka.connect.source (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/source/package-tree.html b/31/javadoc/org/apache/kafka/connect/source/package-tree.html
index 6616a2952..1697a56e4 100644
--- a/31/javadoc/org/apache/kafka/connect/source/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/source/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.source Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.source Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/Converter.html b/31/javadoc/org/apache/kafka/connect/storage/Converter.html
index aeef374ac..a245c5edb 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/Converter.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/Converter.html
@@ -2,7 +2,7 @@
-Converter (kafka 3.1.0 API)
+Converter (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/ConverterConfig.html b/31/javadoc/org/apache/kafka/connect/storage/ConverterConfig.html
index 08b475d89..e0fcb3b24 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/ConverterConfig.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/ConverterConfig.html
@@ -2,7 +2,7 @@
-ConverterConfig (kafka 3.1.0 API)
+ConverterConfig (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/ConverterType.html b/31/javadoc/org/apache/kafka/connect/storage/ConverterType.html
index a5c0d681f..2da105824 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/ConverterType.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/ConverterType.html
@@ -2,7 +2,7 @@
-ConverterType (kafka 3.1.0 API)
+ConverterType (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/HeaderConverter.html b/31/javadoc/org/apache/kafka/connect/storage/HeaderConverter.html
index b0b45250f..80c1f92f3 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/HeaderConverter.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/HeaderConverter.html
@@ -2,7 +2,7 @@
-HeaderConverter (kafka 3.1.0 API)
+HeaderConverter (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/OffsetStorageReader.html b/31/javadoc/org/apache/kafka/connect/storage/OffsetStorageReader.html
index 75a78d87d..3ece75b2b 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/OffsetStorageReader.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/OffsetStorageReader.html
@@ -2,7 +2,7 @@
-OffsetStorageReader (kafka 3.1.0 API)
+OffsetStorageReader (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/SimpleHeaderConverter.html b/31/javadoc/org/apache/kafka/connect/storage/SimpleHeaderConverter.html
index 072114e51..5c981c308 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/SimpleHeaderConverter.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/SimpleHeaderConverter.html
@@ -2,7 +2,7 @@
-SimpleHeaderConverter (kafka 3.1.0 API)
+SimpleHeaderConverter (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/StringConverter.html b/31/javadoc/org/apache/kafka/connect/storage/StringConverter.html
index 650bd8700..a96d05d28 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/StringConverter.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/StringConverter.html
@@ -2,7 +2,7 @@
-StringConverter (kafka 3.1.0 API)
+StringConverter (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/StringConverterConfig.html b/31/javadoc/org/apache/kafka/connect/storage/StringConverterConfig.html
index a5a5c31be..fa1ef2da0 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/StringConverterConfig.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/StringConverterConfig.html
@@ -2,7 +2,7 @@
-StringConverterConfig (kafka 3.1.0 API)
+StringConverterConfig (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/package-summary.html b/31/javadoc/org/apache/kafka/connect/storage/package-summary.html
index 7ea5e1bd9..937099c34 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.storage (kafka 3.1.0 API)
+org.apache.kafka.connect.storage (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/storage/package-tree.html b/31/javadoc/org/apache/kafka/connect/storage/package-tree.html
index f05041092..627dca6f2 100644
--- a/31/javadoc/org/apache/kafka/connect/storage/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/storage/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.storage Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.storage Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/Transformation.html b/31/javadoc/org/apache/kafka/connect/transforms/Transformation.html
index adbb4a354..379c924b6 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/Transformation.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/Transformation.html
@@ -2,7 +2,7 @@
-Transformation (kafka 3.1.0 API)
+Transformation (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/package-summary.html b/31/javadoc/org/apache/kafka/connect/transforms/package-summary.html
index 8479f1387..fc6417f8d 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.transforms (kafka 3.1.0 API)
+org.apache.kafka.connect.transforms (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/package-tree.html b/31/javadoc/org/apache/kafka/connect/transforms/package-tree.html
index 7a9f13f90..05993dd69 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.transforms Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.transforms Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/predicates/Predicate.html b/31/javadoc/org/apache/kafka/connect/transforms/predicates/Predicate.html
index 9d1ff3b84..875e38586 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/predicates/Predicate.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/predicates/Predicate.html
@@ -2,7 +2,7 @@
-Predicate (kafka 3.1.0 API)
+Predicate (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-summary.html b/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-summary.html
index 07d78be8c..35916c633 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.transforms.predicates (kafka 3.1.0 API)
+org.apache.kafka.connect.transforms.predicates (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-tree.html b/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-tree.html
index cb136fbe3..4d29fac24 100644
--- a/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/transforms/predicates/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.transforms.predicates Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.transforms.predicates Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/util/ConnectorUtils.html b/31/javadoc/org/apache/kafka/connect/util/ConnectorUtils.html
index 1802c2194..b425218f0 100644
--- a/31/javadoc/org/apache/kafka/connect/util/ConnectorUtils.html
+++ b/31/javadoc/org/apache/kafka/connect/util/ConnectorUtils.html
@@ -2,7 +2,7 @@
-ConnectorUtils (kafka 3.1.0 API)
+ConnectorUtils (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/util/package-summary.html b/31/javadoc/org/apache/kafka/connect/util/package-summary.html
index a041c97a6..94794a500 100644
--- a/31/javadoc/org/apache/kafka/connect/util/package-summary.html
+++ b/31/javadoc/org/apache/kafka/connect/util/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.util (kafka 3.1.0 API)
+org.apache.kafka.connect.util (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/connect/util/package-tree.html b/31/javadoc/org/apache/kafka/connect/util/package-tree.html
index 7bd4ed7ee..b58cd7bdc 100644
--- a/31/javadoc/org/apache/kafka/connect/util/package-tree.html
+++ b/31/javadoc/org/apache/kafka/connect/util/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.connect.util Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.connect.util Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AclCreateResult.html b/31/javadoc/org/apache/kafka/server/authorizer/AclCreateResult.html
index f1c6d1c52..82acd2e26 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AclCreateResult.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AclCreateResult.html
@@ -2,7 +2,7 @@
-AclCreateResult (kafka 3.1.0 API)
+AclCreateResult (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.AclBindingDeleteResult.html b/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.AclBindingDeleteResult.html
index fbea3938b..55ac855dd 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.AclBindingDeleteResult.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.AclBindingDeleteResult.html
@@ -2,7 +2,7 @@
-AclDeleteResult.AclBindingDeleteResult (kafka 3.1.0 API)
+AclDeleteResult.AclBindingDeleteResult (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.html b/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.html
index 897bd0e58..b1f7edbb2 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AclDeleteResult.html
@@ -2,7 +2,7 @@
-AclDeleteResult (kafka 3.1.0 API)
+AclDeleteResult (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/Action.html b/31/javadoc/org/apache/kafka/server/authorizer/Action.html
index a779218f4..34ea3d7a6 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/Action.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/Action.html
@@ -2,7 +2,7 @@
-Action (kafka 3.1.0 API)
+Action (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizableRequestContext.html b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizableRequestContext.html
index 1570a8fec..426982586 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizableRequestContext.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizableRequestContext.html
@@ -2,7 +2,7 @@
-AuthorizableRequestContext (kafka 3.1.0 API)
+AuthorizableRequestContext (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizationResult.html b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizationResult.html
index 7b98c1ef1..5285865bc 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizationResult.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizationResult.html
@@ -2,7 +2,7 @@
-AuthorizationResult (kafka 3.1.0 API)
+AuthorizationResult (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/Authorizer.html b/31/javadoc/org/apache/kafka/server/authorizer/Authorizer.html
index 47809e252..650fc9c35 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/Authorizer.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/Authorizer.html
@@ -2,7 +2,7 @@
-Authorizer (kafka 3.1.0 API)
+Authorizer (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizerServerInfo.html b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizerServerInfo.html
index c895ebabf..84286d6b4 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/AuthorizerServerInfo.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/AuthorizerServerInfo.html
@@ -2,7 +2,7 @@
-AuthorizerServerInfo (kafka 3.1.0 API)
+AuthorizerServerInfo (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/package-summary.html b/31/javadoc/org/apache/kafka/server/authorizer/package-summary.html
index a25cc057d..ced759adb 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/package-summary.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/package-summary.html
@@ -2,7 +2,7 @@
-org.apache.kafka.server.authorizer (kafka 3.1.0 API)
+org.apache.kafka.server.authorizer (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/authorizer/package-tree.html b/31/javadoc/org/apache/kafka/server/authorizer/package-tree.html
index 7290cc9b8..57b9035af 100644
--- a/31/javadoc/org/apache/kafka/server/authorizer/package-tree.html
+++ b/31/javadoc/org/apache/kafka/server/authorizer/package-tree.html
@@ -2,7 +2,7 @@
-org.apache.kafka.server.authorizer Class Hierarchy (kafka 3.1.0 API)
+org.apache.kafka.server.authorizer Class Hierarchy (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/LogSegmentData.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/LogSegmentData.html
index 77897aa6f..efb2ca9df 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/LogSegmentData.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/LogSegmentData.html
@@ -2,7 +2,7 @@
-LogSegmentData (kafka 3.1.0 API)
+LogSegmentData (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadata.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadata.html
index 254cb0cdc..6ab0c29f0 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadata.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadata.html
@@ -2,7 +2,7 @@
-RemoteLogMetadata (kafka 3.1.0 API)
+RemoteLogMetadata (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadataManager.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadataManager.html
index f6d3ad74c..abea056a2 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadataManager.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogMetadataManager.html
@@ -2,7 +2,7 @@
-RemoteLogMetadataManager (kafka 3.1.0 API)
+RemoteLogMetadataManager (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentId.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentId.html
index b09d42b55..f75f05d91 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentId.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentId.html
@@ -2,7 +2,7 @@
-RemoteLogSegmentId (kafka 3.1.0 API)
+RemoteLogSegmentId (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadata.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadata.html
index 5225ef965..fc6b6091d 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadata.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadata.html
@@ -2,7 +2,7 @@
-RemoteLogSegmentMetadata (kafka 3.1.0 API)
+RemoteLogSegmentMetadata (kafka 3.1.1 API)
diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadataUpdate.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadataUpdate.html
index d6318fdc6..0d1710033 100644
--- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadataUpdate.html
+++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentMetadataUpdate.html
@@ -2,7 +2,7 @@
-RemoteLogSegmentMetadataUpdate (kafka 3.1.0 API)
+RemoteLogSegmentMetadataUpdate (kafka 3.1.1
API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentState.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentState.html index 7a0f1b4c5..632317d06 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentState.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteLogSegmentState.html @@ -2,7 +2,7 @@ -RemoteLogSegmentState (kafka 3.1.0 API) +RemoteLogSegmentState (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteMetadata.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteMetadata.html index e1d62cca6..cc75beba5 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteMetadata.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteMetadata.html @@ -2,7 +2,7 @@ -RemotePartitionDeleteMetadata (kafka 3.1.0 API) +RemotePartitionDeleteMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteState.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteState.html index 724f31b27..dc2142cb5 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteState.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemotePartitionDeleteState.html @@ -2,7 +2,7 @@ -RemotePartitionDeleteState (kafka 3.1.0 API) +RemotePartitionDeleteState (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteResourceNotFoundException.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteResourceNotFoundException.html index 79678eb5d..b68b04920 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteResourceNotFoundException.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteResourceNotFoundException.html @@ -2,7 +2,7 @@ 
-RemoteResourceNotFoundException (kafka 3.1.0 API) +RemoteResourceNotFoundException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageException.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageException.html index c01225b09..fc036635e 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageException.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageException.html @@ -2,7 +2,7 @@ -RemoteStorageException (kafka 3.1.0 API) +RemoteStorageException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.IndexType.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.IndexType.html index 7a5e19a85..3513cabeb 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.IndexType.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.IndexType.html @@ -2,7 +2,7 @@ -RemoteStorageManager.IndexType (kafka 3.1.0 API) +RemoteStorageManager.IndexType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.html index b633f4aaa..dc59af85a 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/RemoteStorageManager.html @@ -2,7 +2,7 @@ -RemoteStorageManager (kafka 3.1.0 API) +RemoteStorageManager (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/package-summary.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/package-summary.html index a9d54b92f..e6a76106a 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/package-summary.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/package-summary.html @@ -2,7 +2,7 @@ 
-org.apache.kafka.server.log.remote.storage (kafka 3.1.0 API) +org.apache.kafka.server.log.remote.storage (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/log/remote/storage/package-tree.html b/31/javadoc/org/apache/kafka/server/log/remote/storage/package-tree.html index 333000ca9..7a4158acf 100644 --- a/31/javadoc/org/apache/kafka/server/log/remote/storage/package-tree.html +++ b/31/javadoc/org/apache/kafka/server/log/remote/storage/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.server.log.remote.storage Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.server.log.remote.storage Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.RequestMetadata.html b/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.RequestMetadata.html index 584709fe2..230c51b5c 100644 --- a/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.RequestMetadata.html +++ b/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.RequestMetadata.html @@ -2,7 +2,7 @@ -AlterConfigPolicy.RequestMetadata (kafka 3.1.0 API) +AlterConfigPolicy.RequestMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.html b/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.html index 2df5316f1..d8072c46d 100644 --- a/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.html +++ b/31/javadoc/org/apache/kafka/server/policy/AlterConfigPolicy.html @@ -2,7 +2,7 @@ -AlterConfigPolicy (kafka 3.1.0 API) +AlterConfigPolicy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.RequestMetadata.html b/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.RequestMetadata.html index aa505a57e..a00e7db49 100644 --- a/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.RequestMetadata.html +++ b/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.RequestMetadata.html @@ -2,7 +2,7 @@ -CreateTopicPolicy.RequestMetadata 
(kafka 3.1.0 API) +CreateTopicPolicy.RequestMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.html b/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.html index 20d1759dc..a6a80cdca 100644 --- a/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.html +++ b/31/javadoc/org/apache/kafka/server/policy/CreateTopicPolicy.html @@ -2,7 +2,7 @@ -CreateTopicPolicy (kafka 3.1.0 API) +CreateTopicPolicy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/package-summary.html b/31/javadoc/org/apache/kafka/server/policy/package-summary.html index 66a491c48..15aae81d7 100644 --- a/31/javadoc/org/apache/kafka/server/policy/package-summary.html +++ b/31/javadoc/org/apache/kafka/server/policy/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.server.policy (kafka 3.1.0 API) +org.apache.kafka.server.policy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/policy/package-tree.html b/31/javadoc/org/apache/kafka/server/policy/package-tree.html index 3b1bebb42..01a8af627 100644 --- a/31/javadoc/org/apache/kafka/server/policy/package-tree.html +++ b/31/javadoc/org/apache/kafka/server/policy/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.server.policy Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.server.policy Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaCallback.html b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaCallback.html index d867d6754..fb8f5e63d 100644 --- a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaCallback.html +++ b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaCallback.html @@ -2,7 +2,7 @@ -ClientQuotaCallback (kafka 3.1.0 API) +ClientQuotaCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntity.html b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntity.html index 11cd794aa..e52870de0 100644 --- 
a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntity.html +++ b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntity.html @@ -2,7 +2,7 @@ -ClientQuotaEntity.ConfigEntity (kafka 3.1.0 API) +ClientQuotaEntity.ConfigEntity (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntityType.html b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntityType.html index 69ced49da..c0f8ad505 100644 --- a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntityType.html +++ b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.ConfigEntityType.html @@ -2,7 +2,7 @@ -ClientQuotaEntity.ConfigEntityType (kafka 3.1.0 API) +ClientQuotaEntity.ConfigEntityType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.html b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.html index 694ac8f05..77bfa94bd 100644 --- a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.html +++ b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaEntity.html @@ -2,7 +2,7 @@ -ClientQuotaEntity (kafka 3.1.0 API) +ClientQuotaEntity (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaType.html b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaType.html index a311fa0cf..e8560ade9 100644 --- a/31/javadoc/org/apache/kafka/server/quota/ClientQuotaType.html +++ b/31/javadoc/org/apache/kafka/server/quota/ClientQuotaType.html @@ -2,7 +2,7 @@ -ClientQuotaType (kafka 3.1.0 API) +ClientQuotaType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/package-summary.html b/31/javadoc/org/apache/kafka/server/quota/package-summary.html index c3c4e17f3..0eafb9b3a 100644 --- a/31/javadoc/org/apache/kafka/server/quota/package-summary.html +++ b/31/javadoc/org/apache/kafka/server/quota/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.server.quota (kafka 3.1.0 API) +org.apache.kafka.server.quota 
(kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/server/quota/package-tree.html b/31/javadoc/org/apache/kafka/server/quota/package-tree.html index 40d887359..1eee8878f 100644 --- a/31/javadoc/org/apache/kafka/server/quota/package-tree.html +++ b/31/javadoc/org/apache/kafka/server/quota/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.server.quota Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.server.quota Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KafkaClientSupplier.html b/31/javadoc/org/apache/kafka/streams/KafkaClientSupplier.html index fdee628be..3caccbd44 100644 --- a/31/javadoc/org/apache/kafka/streams/KafkaClientSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/KafkaClientSupplier.html @@ -2,7 +2,7 @@ -KafkaClientSupplier (kafka 3.1.0 API) +KafkaClientSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KafkaStreams.State.html b/31/javadoc/org/apache/kafka/streams/KafkaStreams.State.html index f71c21d03..e1c18271a 100644 --- a/31/javadoc/org/apache/kafka/streams/KafkaStreams.State.html +++ b/31/javadoc/org/apache/kafka/streams/KafkaStreams.State.html @@ -2,7 +2,7 @@ -KafkaStreams.State (kafka 3.1.0 API) +KafkaStreams.State (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KafkaStreams.StateListener.html b/31/javadoc/org/apache/kafka/streams/KafkaStreams.StateListener.html index 4b4b988ca..a04e483cd 100644 --- a/31/javadoc/org/apache/kafka/streams/KafkaStreams.StateListener.html +++ b/31/javadoc/org/apache/kafka/streams/KafkaStreams.StateListener.html @@ -2,7 +2,7 @@ -KafkaStreams.StateListener (kafka 3.1.0 API) +KafkaStreams.StateListener (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KafkaStreams.html b/31/javadoc/org/apache/kafka/streams/KafkaStreams.html index 22b68cdf4..647a19270 100644 --- a/31/javadoc/org/apache/kafka/streams/KafkaStreams.html +++ b/31/javadoc/org/apache/kafka/streams/KafkaStreams.html @@ -2,7 +2,7 @@ 
-KafkaStreams (kafka 3.1.0 API) +KafkaStreams (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KeyQueryMetadata.html b/31/javadoc/org/apache/kafka/streams/KeyQueryMetadata.html index b1d9819bb..1a203226e 100644 --- a/31/javadoc/org/apache/kafka/streams/KeyQueryMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/KeyQueryMetadata.html @@ -2,7 +2,7 @@ -KeyQueryMetadata (kafka 3.1.0 API) +KeyQueryMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/KeyValue.html b/31/javadoc/org/apache/kafka/streams/KeyValue.html index de56904a0..a92c6533e 100644 --- a/31/javadoc/org/apache/kafka/streams/KeyValue.html +++ b/31/javadoc/org/apache/kafka/streams/KeyValue.html @@ -2,7 +2,7 @@ -KeyValue (kafka 3.1.0 API) +KeyValue (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/LagInfo.html b/31/javadoc/org/apache/kafka/streams/LagInfo.html index a2c3f559d..3608510d8 100644 --- a/31/javadoc/org/apache/kafka/streams/LagInfo.html +++ b/31/javadoc/org/apache/kafka/streams/LagInfo.html @@ -2,7 +2,7 @@ -LagInfo (kafka 3.1.0 API) +LagInfo (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StoreQueryParameters.html b/31/javadoc/org/apache/kafka/streams/StoreQueryParameters.html index b2d72a58d..2c3db8ccb 100644 --- a/31/javadoc/org/apache/kafka/streams/StoreQueryParameters.html +++ b/31/javadoc/org/apache/kafka/streams/StoreQueryParameters.html @@ -2,7 +2,7 @@ -StoreQueryParameters (kafka 3.1.0 API) +StoreQueryParameters (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StreamsBuilder.html b/31/javadoc/org/apache/kafka/streams/StreamsBuilder.html index 0689a9eb9..81641036c 100644 --- a/31/javadoc/org/apache/kafka/streams/StreamsBuilder.html +++ b/31/javadoc/org/apache/kafka/streams/StreamsBuilder.html @@ -2,7 +2,7 @@ -StreamsBuilder (kafka 3.1.0 API) +StreamsBuilder (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StreamsConfig.InternalConfig.html 
b/31/javadoc/org/apache/kafka/streams/StreamsConfig.InternalConfig.html index 686583c59..f0786b197 100644 --- a/31/javadoc/org/apache/kafka/streams/StreamsConfig.InternalConfig.html +++ b/31/javadoc/org/apache/kafka/streams/StreamsConfig.InternalConfig.html @@ -2,7 +2,7 @@ -StreamsConfig.InternalConfig (kafka 3.1.0 API) +StreamsConfig.InternalConfig (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StreamsConfig.html b/31/javadoc/org/apache/kafka/streams/StreamsConfig.html index 5ccf114cf..a83cb765f 100644 --- a/31/javadoc/org/apache/kafka/streams/StreamsConfig.html +++ b/31/javadoc/org/apache/kafka/streams/StreamsConfig.html @@ -2,7 +2,7 @@ -StreamsConfig (kafka 3.1.0 API) +StreamsConfig (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StreamsMetadata.html b/31/javadoc/org/apache/kafka/streams/StreamsMetadata.html index fb79fdbdf..7db8542bd 100644 --- a/31/javadoc/org/apache/kafka/streams/StreamsMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/StreamsMetadata.html @@ -2,7 +2,7 @@ -StreamsMetadata (kafka 3.1.0 API) +StreamsMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/StreamsMetrics.html b/31/javadoc/org/apache/kafka/streams/StreamsMetrics.html index 2b1c5a317..7d30e0cdb 100644 --- a/31/javadoc/org/apache/kafka/streams/StreamsMetrics.html +++ b/31/javadoc/org/apache/kafka/streams/StreamsMetrics.html @@ -2,7 +2,7 @@ -StreamsMetrics (kafka 3.1.0 API) +StreamsMetrics (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TaskMetadata.html b/31/javadoc/org/apache/kafka/streams/TaskMetadata.html index 87c285071..d385d3f9e 100644 --- a/31/javadoc/org/apache/kafka/streams/TaskMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/TaskMetadata.html @@ -2,7 +2,7 @@ -TaskMetadata (kafka 3.1.0 API) +TaskMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TestInputTopic.html b/31/javadoc/org/apache/kafka/streams/TestInputTopic.html index ff5234951..d7a14ef97 100644 --- 
a/31/javadoc/org/apache/kafka/streams/TestInputTopic.html +++ b/31/javadoc/org/apache/kafka/streams/TestInputTopic.html @@ -2,7 +2,7 @@ -TestInputTopic (kafka 3.1.0 API) +TestInputTopic (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TestOutputTopic.html b/31/javadoc/org/apache/kafka/streams/TestOutputTopic.html index 6a3bf4104..f0f581204 100644 --- a/31/javadoc/org/apache/kafka/streams/TestOutputTopic.html +++ b/31/javadoc/org/apache/kafka/streams/TestOutputTopic.html @@ -2,7 +2,7 @@ -TestOutputTopic (kafka 3.1.0 API) +TestOutputTopic (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/ThreadMetadata.html b/31/javadoc/org/apache/kafka/streams/ThreadMetadata.html index b90775ac6..c89c28a4d 100644 --- a/31/javadoc/org/apache/kafka/streams/ThreadMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/ThreadMetadata.html @@ -2,7 +2,7 @@ -ThreadMetadata (kafka 3.1.0 API) +ThreadMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/Topology.AutoOffsetReset.html b/31/javadoc/org/apache/kafka/streams/Topology.AutoOffsetReset.html index 73d3669a7..4c1f0691b 100644 --- a/31/javadoc/org/apache/kafka/streams/Topology.AutoOffsetReset.html +++ b/31/javadoc/org/apache/kafka/streams/Topology.AutoOffsetReset.html @@ -2,7 +2,7 @@ -Topology.AutoOffsetReset (kafka 3.1.0 API) +Topology.AutoOffsetReset (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/Topology.html b/31/javadoc/org/apache/kafka/streams/Topology.html index 295950e14..45a3a93d8 100644 --- a/31/javadoc/org/apache/kafka/streams/Topology.html +++ b/31/javadoc/org/apache/kafka/streams/Topology.html @@ -2,7 +2,7 @@ -Topology (kafka 3.1.0 API) +Topology (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.GlobalStore.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.GlobalStore.html index 2a123c615..bca3a1ab5 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.GlobalStore.html +++ 
b/31/javadoc/org/apache/kafka/streams/TopologyDescription.GlobalStore.html @@ -2,7 +2,7 @@ -TopologyDescription.GlobalStore (kafka 3.1.0 API) +TopologyDescription.GlobalStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Node.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Node.html index 31d38a7e2..6716f68b4 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Node.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Node.html @@ -2,7 +2,7 @@ -TopologyDescription.Node (kafka 3.1.0 API) +TopologyDescription.Node (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Processor.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Processor.html index f289f0b72..c03e6c12d 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Processor.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Processor.html @@ -2,7 +2,7 @@ -TopologyDescription.Processor (kafka 3.1.0 API) +TopologyDescription.Processor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Sink.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Sink.html index 3f6c36134..bd900cb97 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Sink.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Sink.html @@ -2,7 +2,7 @@ -TopologyDescription.Sink (kafka 3.1.0 API) +TopologyDescription.Sink (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Source.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Source.html index 6466d6203..8beacf6f5 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Source.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Source.html @@ -2,7 +2,7 @@ -TopologyDescription.Source (kafka 3.1.0 API) +TopologyDescription.Source (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Subtopology.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Subtopology.html index 22d049860..d0908b126 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.Subtopology.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.Subtopology.html @@ -2,7 +2,7 @@ -TopologyDescription.Subtopology (kafka 3.1.0 API) +TopologyDescription.Subtopology (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyDescription.html b/31/javadoc/org/apache/kafka/streams/TopologyDescription.html index 1a2530b62..af3b4e61a 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyDescription.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyDescription.html @@ -2,7 +2,7 @@ -TopologyDescription (kafka 3.1.0 API) +TopologyDescription (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/TopologyTestDriver.html b/31/javadoc/org/apache/kafka/streams/TopologyTestDriver.html index 008a71bbe..ca2d0dfdf 100644 --- a/31/javadoc/org/apache/kafka/streams/TopologyTestDriver.html +++ b/31/javadoc/org/apache/kafka/streams/TopologyTestDriver.html @@ -2,7 +2,7 @@ -TopologyTestDriver (kafka 3.1.0 API) +TopologyTestDriver (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/BrokerNotFoundException.html b/31/javadoc/org/apache/kafka/streams/errors/BrokerNotFoundException.html index 42c58ff7f..93d3089de 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/BrokerNotFoundException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/BrokerNotFoundException.html @@ -2,7 +2,7 @@ -BrokerNotFoundException (kafka 3.1.0 API) +BrokerNotFoundException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/DefaultProductionExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/DefaultProductionExceptionHandler.html index 4f023dd93..af885a362 100644 --- 
a/31/javadoc/org/apache/kafka/streams/errors/DefaultProductionExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/DefaultProductionExceptionHandler.html @@ -2,7 +2,7 @@ -DefaultProductionExceptionHandler (kafka 3.1.0 API) +DefaultProductionExceptionHandler (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.DeserializationHandlerResponse.html b/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.DeserializationHandlerResponse.html index f9b5e0118..18f98e75d 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.DeserializationHandlerResponse.html +++ b/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.DeserializationHandlerResponse.html @@ -2,7 +2,7 @@ -DeserializationExceptionHandler.DeserializationHandlerResponse (kafka 3.1.0 API) +DeserializationExceptionHandler.DeserializationHandlerResponse (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.html index 84781f54c..7b96c630b 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/DeserializationExceptionHandler.html @@ -2,7 +2,7 @@ -DeserializationExceptionHandler (kafka 3.1.0 API) +DeserializationExceptionHandler (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStoreException.html b/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStoreException.html index d1aa9e4e5..d91f08e64 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStoreException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStoreException.html @@ -2,7 +2,7 @@ -InvalidStateStoreException (kafka 3.1.0 API) +InvalidStateStoreException (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStorePartitionException.html b/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStorePartitionException.html index f57a3a285..3c674c057 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStorePartitionException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/InvalidStateStorePartitionException.html @@ -2,7 +2,7 @@ -InvalidStateStorePartitionException (kafka 3.1.0 API) +InvalidStateStorePartitionException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/LockException.html b/31/javadoc/org/apache/kafka/streams/errors/LockException.html index 9109f6a82..a24ddf037 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/LockException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/LockException.html @@ -2,7 +2,7 @@ -LockException (kafka 3.1.0 API) +LockException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/LogAndContinueExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/LogAndContinueExceptionHandler.html index 51ff69bfd..6b4f4d98f 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/LogAndContinueExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/LogAndContinueExceptionHandler.html @@ -2,7 +2,7 @@ -LogAndContinueExceptionHandler (kafka 3.1.0 API) +LogAndContinueExceptionHandler (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/LogAndFailExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/LogAndFailExceptionHandler.html index 1bfef4459..0629e7a17 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/LogAndFailExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/LogAndFailExceptionHandler.html @@ -2,7 +2,7 @@ -LogAndFailExceptionHandler (kafka 3.1.0 API) +LogAndFailExceptionHandler (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/MissingSourceTopicException.html 
b/31/javadoc/org/apache/kafka/streams/errors/MissingSourceTopicException.html index 2a6881600..0acab68db 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/MissingSourceTopicException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/MissingSourceTopicException.html @@ -2,7 +2,7 @@ -MissingSourceTopicException (kafka 3.1.0 API) +MissingSourceTopicException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/ProcessorStateException.html b/31/javadoc/org/apache/kafka/streams/errors/ProcessorStateException.html index c72ab8ccb..a9f45022b 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/ProcessorStateException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/ProcessorStateException.html @@ -2,7 +2,7 @@ -ProcessorStateException (kafka 3.1.0 API) +ProcessorStateException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.ProductionExceptionHandlerResponse.html b/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.ProductionExceptionHandlerResponse.html index 204fa1fea..f23ff0d2f 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.ProductionExceptionHandlerResponse.html +++ b/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.ProductionExceptionHandlerResponse.html @@ -2,7 +2,7 @@ -ProductionExceptionHandler.ProductionExceptionHandlerResponse (kafka 3.1.0 API) +ProductionExceptionHandler.ProductionExceptionHandlerResponse (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.html index 4bf92284a..5d6d36ade 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/ProductionExceptionHandler.html @@ -2,7 +2,7 @@ -ProductionExceptionHandler (kafka 3.1.0 API) +ProductionExceptionHandler (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/streams/errors/StateStoreMigratedException.html b/31/javadoc/org/apache/kafka/streams/errors/StateStoreMigratedException.html index b597a474a..2901c3b24 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StateStoreMigratedException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StateStoreMigratedException.html @@ -2,7 +2,7 @@ -StateStoreMigratedException (kafka 3.1.0 API) +StateStoreMigratedException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StateStoreNotAvailableException.html b/31/javadoc/org/apache/kafka/streams/errors/StateStoreNotAvailableException.html index 1a7931d10..cc0570bf6 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StateStoreNotAvailableException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StateStoreNotAvailableException.html @@ -2,7 +2,7 @@ -StateStoreNotAvailableException (kafka 3.1.0 API) +StateStoreNotAvailableException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StreamsException.html b/31/javadoc/org/apache/kafka/streams/errors/StreamsException.html index 903d74768..055966fed 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StreamsException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StreamsException.html @@ -2,7 +2,7 @@ -StreamsException (kafka 3.1.0 API) +StreamsException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StreamsNotStartedException.html b/31/javadoc/org/apache/kafka/streams/errors/StreamsNotStartedException.html index a5fe22916..00d8eb43d 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StreamsNotStartedException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StreamsNotStartedException.html @@ -2,7 +2,7 @@ -StreamsNotStartedException (kafka 3.1.0 API) +StreamsNotStartedException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StreamsRebalancingException.html 
b/31/javadoc/org/apache/kafka/streams/errors/StreamsRebalancingException.html index b19f32a71..0921d316f 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StreamsRebalancingException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StreamsRebalancingException.html @@ -2,7 +2,7 @@ -StreamsRebalancingException (kafka 3.1.0 API) +StreamsRebalancingException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse.html b/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse.html index 25f010efe..7b921bbda 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse.html @@ -2,7 +2,7 @@ -StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse (kafka 3.1.0 API) +StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.html b/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.html index ce2bb22e3..b03dcb1eb 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.html +++ b/31/javadoc/org/apache/kafka/streams/errors/StreamsUncaughtExceptionHandler.html @@ -2,7 +2,7 @@ -StreamsUncaughtExceptionHandler (kafka 3.1.0 API) +StreamsUncaughtExceptionHandler (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/TaskAssignmentException.html b/31/javadoc/org/apache/kafka/streams/errors/TaskAssignmentException.html index dddef6dcc..5799a0812 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/TaskAssignmentException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/TaskAssignmentException.html @@ -2,7 +2,7 @@ -TaskAssignmentException (kafka 3.1.0 API) +TaskAssignmentException 
(kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/TaskCorruptedException.html b/31/javadoc/org/apache/kafka/streams/errors/TaskCorruptedException.html index a9df16382..5bb0d0130 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/TaskCorruptedException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/TaskCorruptedException.html @@ -2,7 +2,7 @@ -TaskCorruptedException (kafka 3.1.0 API) +TaskCorruptedException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/TaskIdFormatException.html b/31/javadoc/org/apache/kafka/streams/errors/TaskIdFormatException.html index db5b14e59..77e7dd40c 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/TaskIdFormatException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/TaskIdFormatException.html @@ -2,7 +2,7 @@ -TaskIdFormatException (kafka 3.1.0 API) +TaskIdFormatException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/TaskMigratedException.html b/31/javadoc/org/apache/kafka/streams/errors/TaskMigratedException.html index 2271e87fd..8a0a139ed 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/TaskMigratedException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/TaskMigratedException.html @@ -2,7 +2,7 @@ -TaskMigratedException (kafka 3.1.0 API) +TaskMigratedException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/TopologyException.html b/31/javadoc/org/apache/kafka/streams/errors/TopologyException.html index f1572013e..8b6d90679 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/TopologyException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/TopologyException.html @@ -2,7 +2,7 @@ -TopologyException (kafka 3.1.0 API) +TopologyException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/UnknownStateStoreException.html b/31/javadoc/org/apache/kafka/streams/errors/UnknownStateStoreException.html index d81008b68..fa82b8659 100644 --- 
a/31/javadoc/org/apache/kafka/streams/errors/UnknownStateStoreException.html +++ b/31/javadoc/org/apache/kafka/streams/errors/UnknownStateStoreException.html @@ -2,7 +2,7 @@ -UnknownStateStoreException (kafka 3.1.0 API) +UnknownStateStoreException (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/package-summary.html b/31/javadoc/org/apache/kafka/streams/errors/package-summary.html index 1c29d925b..aedbcb924 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/errors/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.errors (kafka 3.1.0 API) +org.apache.kafka.streams.errors (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/errors/package-tree.html b/31/javadoc/org/apache/kafka/streams/errors/package-tree.html index 16a8b6030..d41b6aa52 100644 --- a/31/javadoc/org/apache/kafka/streams/errors/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/errors/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.errors Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.errors Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Aggregator.html b/31/javadoc/org/apache/kafka/streams/kstream/Aggregator.html index e7582d552..d3520cba6 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Aggregator.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Aggregator.html @@ -2,7 +2,7 @@ -Aggregator (kafka 3.1.0 API) +Aggregator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Branched.html b/31/javadoc/org/apache/kafka/streams/kstream/Branched.html index 297253a73..a6a0e5a7a 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Branched.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Branched.html @@ -2,7 +2,7 @@ -Branched (kafka 3.1.0 API) +Branched (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/BranchedKStream.html 
b/31/javadoc/org/apache/kafka/streams/kstream/BranchedKStream.html index b1c084dd3..ad292189f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/BranchedKStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/BranchedKStream.html @@ -2,7 +2,7 @@ -BranchedKStream (kafka 3.1.0 API) +BranchedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/CogroupedKStream.html b/31/javadoc/org/apache/kafka/streams/kstream/CogroupedKStream.html index cf6379f8f..8e840d088 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/CogroupedKStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/CogroupedKStream.html @@ -2,7 +2,7 @@ -CogroupedKStream (kafka 3.1.0 API) +CogroupedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Consumed.html b/31/javadoc/org/apache/kafka/streams/kstream/Consumed.html index 7334cb4e3..71073caae 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Consumed.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Consumed.html @@ -2,7 +2,7 @@ -Consumed (kafka 3.1.0 API) +Consumed (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ForeachAction.html b/31/javadoc/org/apache/kafka/streams/kstream/ForeachAction.html index 5dce22698..da287f218 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ForeachAction.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ForeachAction.html @@ -2,7 +2,7 @@ -ForeachAction (kafka 3.1.0 API) +ForeachAction (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ForeachProcessor.html b/31/javadoc/org/apache/kafka/streams/kstream/ForeachProcessor.html index a5471a2c5..e14149763 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ForeachProcessor.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ForeachProcessor.html @@ -2,7 +2,7 @@ -ForeachProcessor (kafka 3.1.0 API) +ForeachProcessor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/GlobalKTable.html 
b/31/javadoc/org/apache/kafka/streams/kstream/GlobalKTable.html index cd522e54b..13e89a2c4 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/GlobalKTable.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/GlobalKTable.html @@ -2,7 +2,7 @@ -GlobalKTable (kafka 3.1.0 API) +GlobalKTable (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Grouped.html b/31/javadoc/org/apache/kafka/streams/kstream/Grouped.html index 91961069c..4d341f52f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Grouped.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Grouped.html @@ -2,7 +2,7 @@ -Grouped (kafka 3.1.0 API) +Grouped (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Initializer.html b/31/javadoc/org/apache/kafka/streams/kstream/Initializer.html index 165e9daa5..3ef2a5359 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Initializer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Initializer.html @@ -2,7 +2,7 @@ -Initializer (kafka 3.1.0 API) +Initializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/JoinWindows.html b/31/javadoc/org/apache/kafka/streams/kstream/JoinWindows.html index 267ec7516..7865d9efb 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/JoinWindows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/JoinWindows.html @@ -2,7 +2,7 @@ -JoinWindows (kafka 3.1.0 API) +JoinWindows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Joined.html b/31/javadoc/org/apache/kafka/streams/kstream/Joined.html index ee1587ea3..94849fa6a 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Joined.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Joined.html @@ -2,7 +2,7 @@ -Joined (kafka 3.1.0 API) +Joined (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/KGroupedStream.html b/31/javadoc/org/apache/kafka/streams/kstream/KGroupedStream.html index 8edf76f33..27c1c4347 100644 --- 
a/31/javadoc/org/apache/kafka/streams/kstream/KGroupedStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/KGroupedStream.html @@ -2,7 +2,7 @@ -KGroupedStream (kafka 3.1.0 API) +KGroupedStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/KGroupedTable.html b/31/javadoc/org/apache/kafka/streams/kstream/KGroupedTable.html index 233041970..23e3c4c5f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/KGroupedTable.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/KGroupedTable.html @@ -2,7 +2,7 @@ -KGroupedTable (kafka 3.1.0 API) +KGroupedTable (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/KStream.html b/31/javadoc/org/apache/kafka/streams/kstream/KStream.html index 77bb0978d..1a7b3781e 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/KStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/KStream.html @@ -2,7 +2,7 @@ -KStream (kafka 3.1.0 API) +KStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/KTable.html b/31/javadoc/org/apache/kafka/streams/kstream/KTable.html index c25eca643..3246d295c 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/KTable.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/KTable.html @@ -2,7 +2,7 @@ -KTable (kafka 3.1.0 API) +KTable (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/KeyValueMapper.html b/31/javadoc/org/apache/kafka/streams/kstream/KeyValueMapper.html index d753bee85..d68c4d187 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/KeyValueMapper.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/KeyValueMapper.html @@ -2,7 +2,7 @@ -KeyValueMapper (kafka 3.1.0 API) +KeyValueMapper (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Materialized.html b/31/javadoc/org/apache/kafka/streams/kstream/Materialized.html index 0641e58b3..fb12b1751 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Materialized.html +++ 
b/31/javadoc/org/apache/kafka/streams/kstream/Materialized.html @@ -2,7 +2,7 @@ -Materialized (kafka 3.1.0 API) +Materialized (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Merger.html b/31/javadoc/org/apache/kafka/streams/kstream/Merger.html index e9b8bf43a..0ce714550 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Merger.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Merger.html @@ -2,7 +2,7 @@ -Merger (kafka 3.1.0 API) +Merger (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Named.html b/31/javadoc/org/apache/kafka/streams/kstream/Named.html index 37b5c33ba..273735eab 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Named.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Named.html @@ -2,7 +2,7 @@ -Named (kafka 3.1.0 API) +Named (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Predicate.html b/31/javadoc/org/apache/kafka/streams/kstream/Predicate.html index 36230e883..381d5b6d4 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Predicate.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Predicate.html @@ -2,7 +2,7 @@ -Predicate (kafka 3.1.0 API) +Predicate (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Printed.html b/31/javadoc/org/apache/kafka/streams/kstream/Printed.html index d0ad41a68..5de0c908f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Printed.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Printed.html @@ -2,7 +2,7 @@ -Printed (kafka 3.1.0 API) +Printed (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Produced.html b/31/javadoc/org/apache/kafka/streams/kstream/Produced.html index 685184d51..c1a805475 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Produced.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Produced.html @@ -2,7 +2,7 @@ -Produced (kafka 3.1.0 API) +Produced (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/streams/kstream/Reducer.html b/31/javadoc/org/apache/kafka/streams/kstream/Reducer.html index c5da84e5e..8a321b9e5 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Reducer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Reducer.html @@ -2,7 +2,7 @@ -Reducer (kafka 3.1.0 API) +Reducer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Repartitioned.html b/31/javadoc/org/apache/kafka/streams/kstream/Repartitioned.html index 65fcb0cd4..36b270a35 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Repartitioned.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Repartitioned.html @@ -2,7 +2,7 @@ -Repartitioned (kafka 3.1.0 API) +Repartitioned (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedCogroupedKStream.html b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedCogroupedKStream.html index b5c914174..10662df6e 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedCogroupedKStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedCogroupedKStream.html @@ -2,7 +2,7 @@ -SessionWindowedCogroupedKStream (kafka 3.1.0 API) +SessionWindowedCogroupedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedDeserializer.html b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedDeserializer.html index f88dee531..df791fbfa 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedDeserializer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedDeserializer.html @@ -2,7 +2,7 @@ -SessionWindowedDeserializer (kafka 3.1.0 API) +SessionWindowedDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedKStream.html b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedKStream.html index 6b72efdf9..59381167f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedKStream.html +++ 
b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedKStream.html @@ -2,7 +2,7 @@ -SessionWindowedKStream (kafka 3.1.0 API) +SessionWindowedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedSerializer.html b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedSerializer.html index 37dab3160..63fa97b9a 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedSerializer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindowedSerializer.html @@ -2,7 +2,7 @@ -SessionWindowedSerializer (kafka 3.1.0 API) +SessionWindowedSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindows.html b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindows.html index a178a98ee..1f78468a5 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SessionWindows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/SessionWindows.html @@ -2,7 +2,7 @@ -SessionWindows (kafka 3.1.0 API) +SessionWindows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/SlidingWindows.html b/31/javadoc/org/apache/kafka/streams/kstream/SlidingWindows.html index 3430f0e90..ca4d1381b 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/SlidingWindows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/SlidingWindows.html @@ -2,7 +2,7 @@ -SlidingWindows (kafka 3.1.0 API) +SlidingWindows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/StreamJoined.html b/31/javadoc/org/apache/kafka/streams/kstream/StreamJoined.html index 0536e3d07..89798a321 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/StreamJoined.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/StreamJoined.html @@ -2,7 +2,7 @@ -StreamJoined (kafka 3.1.0 API) +StreamJoined (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.BufferConfig.html b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.BufferConfig.html index 
834d15db4..ea2e75d66 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.BufferConfig.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.BufferConfig.html @@ -2,7 +2,7 @@ -Suppressed.BufferConfig (kafka 3.1.0 API) +Suppressed.BufferConfig (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.EagerBufferConfig.html b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.EagerBufferConfig.html index 02a68bc59..4e969c07d 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.EagerBufferConfig.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.EagerBufferConfig.html @@ -2,7 +2,7 @@ -Suppressed.EagerBufferConfig (kafka 3.1.0 API) +Suppressed.EagerBufferConfig (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.StrictBufferConfig.html b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.StrictBufferConfig.html index aee2b65f8..5772787e5 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.StrictBufferConfig.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.StrictBufferConfig.html @@ -2,7 +2,7 @@ -Suppressed.StrictBufferConfig (kafka 3.1.0 API) +Suppressed.StrictBufferConfig (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.html b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.html index acbb22319..12fb44624 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Suppressed.html @@ -2,7 +2,7 @@ -Suppressed (kafka 3.1.0 API) +Suppressed (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TableJoined.html b/31/javadoc/org/apache/kafka/streams/kstream/TableJoined.html index d254ad16e..2ac44ee22 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TableJoined.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TableJoined.html @@ -2,7 +2,7 @@ -TableJoined (kafka 3.1.0 API) 
+TableJoined (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedCogroupedKStream.html b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedCogroupedKStream.html index 8bc2da085..10d7cb829 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedCogroupedKStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedCogroupedKStream.html @@ -2,7 +2,7 @@ -TimeWindowedCogroupedKStream (kafka 3.1.0 API) +TimeWindowedCogroupedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedDeserializer.html b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedDeserializer.html index 0d1d1db84..7ebd140ea 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedDeserializer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedDeserializer.html @@ -2,7 +2,7 @@ -TimeWindowedDeserializer (kafka 3.1.0 API) +TimeWindowedDeserializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedKStream.html b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedKStream.html index bdeb9f149..c5e95b722 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedKStream.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedKStream.html @@ -2,7 +2,7 @@ -TimeWindowedKStream (kafka 3.1.0 API) +TimeWindowedKStream (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedSerializer.html b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedSerializer.html index 519ecdbf1..971d2df54 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedSerializer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindowedSerializer.html @@ -2,7 +2,7 @@ -TimeWindowedSerializer (kafka 3.1.0 API) +TimeWindowedSerializer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindows.html b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindows.html index 
a0e71b083..2e38518b1 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TimeWindows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TimeWindows.html @@ -2,7 +2,7 @@ -TimeWindows (kafka 3.1.0 API) +TimeWindows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Transformer.html b/31/javadoc/org/apache/kafka/streams/kstream/Transformer.html index 3cb03222c..1afee08c9 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Transformer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Transformer.html @@ -2,7 +2,7 @@ -Transformer (kafka 3.1.0 API) +Transformer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/TransformerSupplier.html b/31/javadoc/org/apache/kafka/streams/kstream/TransformerSupplier.html index 5ace5a78b..8ccee38b5 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/TransformerSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/TransformerSupplier.html @@ -2,7 +2,7 @@ -TransformerSupplier (kafka 3.1.0 API) +TransformerSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/UnlimitedWindows.html b/31/javadoc/org/apache/kafka/streams/kstream/UnlimitedWindows.html index 9761f5645..b7315f17f 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/UnlimitedWindows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/UnlimitedWindows.html @@ -2,7 +2,7 @@ -UnlimitedWindows (kafka 3.1.0 API) +UnlimitedWindows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueJoiner.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueJoiner.html index fd4c8756a..7756552f3 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueJoiner.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueJoiner.html @@ -2,7 +2,7 @@ -ValueJoiner (kafka 3.1.0 API) +ValueJoiner (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueJoinerWithKey.html 
b/31/javadoc/org/apache/kafka/streams/kstream/ValueJoinerWithKey.html index 0792c85ed..2e3c8f192 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueJoinerWithKey.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueJoinerWithKey.html @@ -2,7 +2,7 @@ -ValueJoinerWithKey (kafka 3.1.0 API) +ValueJoinerWithKey (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueMapper.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueMapper.html index 0ef715823..266ca3dd1 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueMapper.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueMapper.html @@ -2,7 +2,7 @@ -ValueMapper (kafka 3.1.0 API) +ValueMapper (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueMapperWithKey.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueMapperWithKey.html index eb740418a..6c3ff4b67 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueMapperWithKey.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueMapperWithKey.html @@ -2,7 +2,7 @@ -ValueMapperWithKey (kafka 3.1.0 API) +ValueMapperWithKey (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformer.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformer.html index 826e408dd..d5d86725c 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformer.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformer.html @@ -2,7 +2,7 @@ -ValueTransformer (kafka 3.1.0 API) +ValueTransformer (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerSupplier.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerSupplier.html index 067f2b8d9..a656c4ef9 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerSupplier.html @@ -2,7 +2,7 @@ -ValueTransformerSupplier (kafka 3.1.0 API) 
+ValueTransformerSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKey.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKey.html index b2c0ebcad..24bdc8d20 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKey.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKey.html @@ -2,7 +2,7 @@ -ValueTransformerWithKey (kafka 3.1.0 API) +ValueTransformerWithKey (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKeySupplier.html b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKeySupplier.html index e368e317c..672ab325b 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKeySupplier.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/ValueTransformerWithKeySupplier.html @@ -2,7 +2,7 @@ -ValueTransformerWithKeySupplier (kafka 3.1.0 API) +ValueTransformerWithKeySupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Window.html b/31/javadoc/org/apache/kafka/streams/kstream/Window.html index 763ffaaed..01763e56a 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Window.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Window.html @@ -2,7 +2,7 @@ -Window (kafka 3.1.0 API) +Window (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Windowed.html b/31/javadoc/org/apache/kafka/streams/kstream/Windowed.html index a4b059ebe..acb11b75d 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Windowed.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Windowed.html @@ -2,7 +2,7 @@ -Windowed (kafka 3.1.0 API) +Windowed (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.SessionWindowedSerde.html b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.SessionWindowedSerde.html index b2fa18fa7..e7d96a841 100644 --- 
a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.SessionWindowedSerde.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.SessionWindowedSerde.html @@ -2,7 +2,7 @@ -WindowedSerdes.SessionWindowedSerde (kafka 3.1.0 API) +WindowedSerdes.SessionWindowedSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.TimeWindowedSerde.html b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.TimeWindowedSerde.html index 3302d6c2b..de24a24e7 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.TimeWindowedSerde.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.TimeWindowedSerde.html @@ -2,7 +2,7 @@ -WindowedSerdes.TimeWindowedSerde (kafka 3.1.0 API) +WindowedSerdes.TimeWindowedSerde (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.html b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.html index 06ef96131..232625e86 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/WindowedSerdes.html @@ -2,7 +2,7 @@ -WindowedSerdes (kafka 3.1.0 API) +WindowedSerdes (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/Windows.html b/31/javadoc/org/apache/kafka/streams/kstream/Windows.html index 6913efe26..9c4652f91 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/Windows.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/Windows.html @@ -2,7 +2,7 @@ -Windows (kafka 3.1.0 API) +Windows (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/package-summary.html b/31/javadoc/org/apache/kafka/streams/kstream/package-summary.html index b304bd09d..b83284311 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.kstream (kafka 3.1.0 API) +org.apache.kafka.streams.kstream 
(kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/kstream/package-tree.html b/31/javadoc/org/apache/kafka/streams/kstream/package-tree.html index ab68106f1..597a63aff 100644 --- a/31/javadoc/org/apache/kafka/streams/kstream/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/kstream/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.kstream Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.kstream Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/package-summary.html b/31/javadoc/org/apache/kafka/streams/package-summary.html index 10ccb617e..713fb3080 100644 --- a/31/javadoc/org/apache/kafka/streams/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams (kafka 3.1.0 API) +org.apache.kafka.streams (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/package-tree.html b/31/javadoc/org/apache/kafka/streams/package-tree.html index 5127d8f9d..a88cf7716 100644 --- a/31/javadoc/org/apache/kafka/streams/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/AbstractProcessor.html b/31/javadoc/org/apache/kafka/streams/processor/AbstractProcessor.html index 9669585c7..b59f0d9eb 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/AbstractProcessor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/AbstractProcessor.html @@ -2,7 +2,7 @@ -AbstractProcessor (kafka 3.1.0 API) +AbstractProcessor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/BatchingStateRestoreCallback.html b/31/javadoc/org/apache/kafka/streams/processor/BatchingStateRestoreCallback.html index b9d492eb9..34661054e 100644 --- 
a/31/javadoc/org/apache/kafka/streams/processor/BatchingStateRestoreCallback.html +++ b/31/javadoc/org/apache/kafka/streams/processor/BatchingStateRestoreCallback.html @@ -2,7 +2,7 @@ -BatchingStateRestoreCallback (kafka 3.1.0 API) +BatchingStateRestoreCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/Cancellable.html b/31/javadoc/org/apache/kafka/streams/processor/Cancellable.html index f926ce4f6..0575add95 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/Cancellable.html +++ b/31/javadoc/org/apache/kafka/streams/processor/Cancellable.html @@ -2,7 +2,7 @@ -Cancellable (kafka 3.1.0 API) +Cancellable (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/ConnectedStoreProvider.html b/31/javadoc/org/apache/kafka/streams/processor/ConnectedStoreProvider.html index a021c1abe..04db13653 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/ConnectedStoreProvider.html +++ b/31/javadoc/org/apache/kafka/streams/processor/ConnectedStoreProvider.html @@ -2,7 +2,7 @@ -ConnectedStoreProvider (kafka 3.1.0 API) +ConnectedStoreProvider (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/FailOnInvalidTimestamp.html b/31/javadoc/org/apache/kafka/streams/processor/FailOnInvalidTimestamp.html index f373fddc6..ac9af2365 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/FailOnInvalidTimestamp.html +++ b/31/javadoc/org/apache/kafka/streams/processor/FailOnInvalidTimestamp.html @@ -2,7 +2,7 @@ -FailOnInvalidTimestamp (kafka 3.1.0 API) +FailOnInvalidTimestamp (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/LogAndSkipOnInvalidTimestamp.html b/31/javadoc/org/apache/kafka/streams/processor/LogAndSkipOnInvalidTimestamp.html index a31275438..9664aef9a 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/LogAndSkipOnInvalidTimestamp.html +++ b/31/javadoc/org/apache/kafka/streams/processor/LogAndSkipOnInvalidTimestamp.html @@ -2,7 +2,7 @@ 
-LogAndSkipOnInvalidTimestamp (kafka 3.1.0 API) +LogAndSkipOnInvalidTimestamp (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedForward.html b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedForward.html index 6e361dc4d..31bb9ce88 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedForward.html +++ b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedForward.html @@ -2,7 +2,7 @@ -MockProcessorContext.CapturedForward (kafka 3.1.0 API) +MockProcessorContext.CapturedForward (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedPunctuator.html b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedPunctuator.html index b92887824..d079310cf 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedPunctuator.html +++ b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.CapturedPunctuator.html @@ -2,7 +2,7 @@ -MockProcessorContext.CapturedPunctuator (kafka 3.1.0 API) +MockProcessorContext.CapturedPunctuator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.html b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.html index a9e88beaa..f782fbf72 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/MockProcessorContext.html @@ -2,7 +2,7 @@ -MockProcessorContext (kafka 3.1.0 API) +MockProcessorContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/Processor.html b/31/javadoc/org/apache/kafka/streams/processor/Processor.html index 019f7cb55..06a5478e3 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/Processor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/Processor.html @@ -2,7 +2,7 @@ -Processor (kafka 3.1.0 API) 
+Processor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/ProcessorContext.html b/31/javadoc/org/apache/kafka/streams/processor/ProcessorContext.html index 8693aa7f1..44ff1f5b5 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/ProcessorContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/ProcessorContext.html @@ -2,7 +2,7 @@ -ProcessorContext (kafka 3.1.0 API) +ProcessorContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/ProcessorSupplier.html b/31/javadoc/org/apache/kafka/streams/processor/ProcessorSupplier.html index 9058c436a..11ad969f6 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/ProcessorSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/processor/ProcessorSupplier.html @@ -2,7 +2,7 @@ -ProcessorSupplier (kafka 3.1.0 API) +ProcessorSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/PunctuationType.html b/31/javadoc/org/apache/kafka/streams/processor/PunctuationType.html index 4dd651b6f..9ef23fec4 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/PunctuationType.html +++ b/31/javadoc/org/apache/kafka/streams/processor/PunctuationType.html @@ -2,7 +2,7 @@ -PunctuationType (kafka 3.1.0 API) +PunctuationType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/Punctuator.html b/31/javadoc/org/apache/kafka/streams/processor/Punctuator.html index 833b29099..d7538e096 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/Punctuator.html +++ b/31/javadoc/org/apache/kafka/streams/processor/Punctuator.html @@ -2,7 +2,7 @@ -Punctuator (kafka 3.1.0 API) +Punctuator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/RecordContext.html b/31/javadoc/org/apache/kafka/streams/processor/RecordContext.html index 260ef957a..03dc230b5 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/RecordContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/RecordContext.html 
@@ -2,7 +2,7 @@ -RecordContext (kafka 3.1.0 API) +RecordContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/StateRestoreCallback.html b/31/javadoc/org/apache/kafka/streams/processor/StateRestoreCallback.html index 8d1ca0845..526d9b164 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/StateRestoreCallback.html +++ b/31/javadoc/org/apache/kafka/streams/processor/StateRestoreCallback.html @@ -2,7 +2,7 @@ -StateRestoreCallback (kafka 3.1.0 API) +StateRestoreCallback (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/StateRestoreListener.html b/31/javadoc/org/apache/kafka/streams/processor/StateRestoreListener.html index d6185058e..76e3d977a 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/StateRestoreListener.html +++ b/31/javadoc/org/apache/kafka/streams/processor/StateRestoreListener.html @@ -2,7 +2,7 @@ -StateRestoreListener (kafka 3.1.0 API) +StateRestoreListener (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/StateStore.html b/31/javadoc/org/apache/kafka/streams/processor/StateStore.html index 102227072..23b924fd5 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/StateStore.html +++ b/31/javadoc/org/apache/kafka/streams/processor/StateStore.html @@ -2,7 +2,7 @@ -StateStore (kafka 3.1.0 API) +StateStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/StateStoreContext.html b/31/javadoc/org/apache/kafka/streams/processor/StateStoreContext.html index 70ba86bce..ad7cb61cc 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/StateStoreContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/StateStoreContext.html @@ -2,7 +2,7 @@ -StateStoreContext (kafka 3.1.0 API) +StateStoreContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/StreamPartitioner.html b/31/javadoc/org/apache/kafka/streams/processor/StreamPartitioner.html index 5bb89f6d6..7646ade4c 100644 --- 
a/31/javadoc/org/apache/kafka/streams/processor/StreamPartitioner.html +++ b/31/javadoc/org/apache/kafka/streams/processor/StreamPartitioner.html @@ -2,7 +2,7 @@ -StreamPartitioner (kafka 3.1.0 API) +StreamPartitioner (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/TaskId.html b/31/javadoc/org/apache/kafka/streams/processor/TaskId.html index 51a9575ad..22b81db01 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/TaskId.html +++ b/31/javadoc/org/apache/kafka/streams/processor/TaskId.html @@ -2,7 +2,7 @@ -TaskId (kafka 3.1.0 API) +TaskId (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/TaskMetadata.html b/31/javadoc/org/apache/kafka/streams/processor/TaskMetadata.html index ea8afcfd3..d5c44ba52 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/TaskMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/processor/TaskMetadata.html @@ -2,7 +2,7 @@ -TaskMetadata (kafka 3.1.0 API) +TaskMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/ThreadMetadata.html b/31/javadoc/org/apache/kafka/streams/processor/ThreadMetadata.html index 7c90f073e..d1b8b2088 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/ThreadMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/processor/ThreadMetadata.html @@ -2,7 +2,7 @@ -ThreadMetadata (kafka 3.1.0 API) +ThreadMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/TimestampExtractor.html b/31/javadoc/org/apache/kafka/streams/processor/TimestampExtractor.html index 3c019c8d7..4ce50c79e 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/TimestampExtractor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/TimestampExtractor.html @@ -2,7 +2,7 @@ -TimestampExtractor (kafka 3.1.0 API) +TimestampExtractor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/To.html b/31/javadoc/org/apache/kafka/streams/processor/To.html index d0537d1ee..d62f230a1 100644 --- 
a/31/javadoc/org/apache/kafka/streams/processor/To.html +++ b/31/javadoc/org/apache/kafka/streams/processor/To.html @@ -2,7 +2,7 @@ -To (kafka 3.1.0 API) +To (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/TopicNameExtractor.html b/31/javadoc/org/apache/kafka/streams/processor/TopicNameExtractor.html index c9ada308b..56d53c071 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/TopicNameExtractor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/TopicNameExtractor.html @@ -2,7 +2,7 @@ -TopicNameExtractor (kafka 3.1.0 API) +TopicNameExtractor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/UsePartitionTimeOnInvalidTimestamp.html b/31/javadoc/org/apache/kafka/streams/processor/UsePartitionTimeOnInvalidTimestamp.html index abb48f9dd..db34393fc 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/UsePartitionTimeOnInvalidTimestamp.html +++ b/31/javadoc/org/apache/kafka/streams/processor/UsePartitionTimeOnInvalidTimestamp.html @@ -2,7 +2,7 @@ -UsePartitionTimeOnInvalidTimestamp (kafka 3.1.0 API) +UsePartitionTimeOnInvalidTimestamp (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/WallclockTimestampExtractor.html b/31/javadoc/org/apache/kafka/streams/processor/WallclockTimestampExtractor.html index 9437805cc..2eef1f82e 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/WallclockTimestampExtractor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/WallclockTimestampExtractor.html @@ -2,7 +2,7 @@ -WallclockTimestampExtractor (kafka 3.1.0 API) +WallclockTimestampExtractor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/ContextualProcessor.html b/31/javadoc/org/apache/kafka/streams/processor/api/ContextualProcessor.html index 15c615792..d2325920e 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/ContextualProcessor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/ContextualProcessor.html @@ -2,7 +2,7 
@@ -ContextualProcessor (kafka 3.1.0 API) +ContextualProcessor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedForward.html b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedForward.html index 02d31da99..df461d33e 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedForward.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedForward.html @@ -2,7 +2,7 @@ -MockProcessorContext.CapturedForward (kafka 3.1.0 API) +MockProcessorContext.CapturedForward (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedPunctuator.html b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedPunctuator.html index 80b76c575..eab5029e2 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedPunctuator.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.CapturedPunctuator.html @@ -2,7 +2,7 @@ -MockProcessorContext.CapturedPunctuator (kafka 3.1.0 API) +MockProcessorContext.CapturedPunctuator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.html b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.html index 042f4a2ed..c40242da7 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/MockProcessorContext.html @@ -2,7 +2,7 @@ -MockProcessorContext (kafka 3.1.0 API) +MockProcessorContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/Processor.html b/31/javadoc/org/apache/kafka/streams/processor/api/Processor.html index 5286ae6b4..bc657eddb 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/Processor.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/Processor.html @@ 
-2,7 +2,7 @@ -Processor (kafka 3.1.0 API) +Processor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorContext.html b/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorContext.html index a9f515061..6acd33e0d 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorContext.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorContext.html @@ -2,7 +2,7 @@ -ProcessorContext (kafka 3.1.0 API) +ProcessorContext (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorSupplier.html b/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorSupplier.html index fb1449f43..0e2402858 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/ProcessorSupplier.html @@ -2,7 +2,7 @@ -ProcessorSupplier (kafka 3.1.0 API) +ProcessorSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/Record.html b/31/javadoc/org/apache/kafka/streams/processor/api/Record.html index 53713b4e8..58c6a69f8 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/Record.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/Record.html @@ -2,7 +2,7 @@ -Record (kafka 3.1.0 API) +Record (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/RecordMetadata.html b/31/javadoc/org/apache/kafka/streams/processor/api/RecordMetadata.html index ffd8dd3a9..e36a59b73 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/RecordMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/RecordMetadata.html @@ -2,7 +2,7 @@ -RecordMetadata (kafka 3.1.0 API) +RecordMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/package-summary.html b/31/javadoc/org/apache/kafka/streams/processor/api/package-summary.html index 4488c6397..2187acc21 100644 --- 
a/31/javadoc/org/apache/kafka/streams/processor/api/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.processor.api (kafka 3.1.0 API) +org.apache.kafka.streams.processor.api (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/api/package-tree.html b/31/javadoc/org/apache/kafka/streams/processor/api/package-tree.html index 0cb001616..50f872bd3 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/api/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/processor/api/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.processor.api Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.processor.api Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/package-summary.html b/31/javadoc/org/apache/kafka/streams/processor/package-summary.html index 15882d6c2..d39df5aed 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/processor/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.processor (kafka 3.1.0 API) +org.apache.kafka.streams.processor (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/processor/package-tree.html b/31/javadoc/org/apache/kafka/streams/processor/package-tree.html index 2ec37b1f5..ced1b66f7 100644 --- a/31/javadoc/org/apache/kafka/streams/processor/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/processor/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.processor Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.processor Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/HostInfo.html b/31/javadoc/org/apache/kafka/streams/state/HostInfo.html index 7e13bf0d3..82dd07e97 100644 --- a/31/javadoc/org/apache/kafka/streams/state/HostInfo.html +++ b/31/javadoc/org/apache/kafka/streams/state/HostInfo.html @@ -2,7 
+2,7 @@ -HostInfo (kafka 3.1.0 API) +HostInfo (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/KeyValueBytesStoreSupplier.html b/31/javadoc/org/apache/kafka/streams/state/KeyValueBytesStoreSupplier.html index 84a67cbe3..da9a3a742 100644 --- a/31/javadoc/org/apache/kafka/streams/state/KeyValueBytesStoreSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/state/KeyValueBytesStoreSupplier.html @@ -2,7 +2,7 @@ -KeyValueBytesStoreSupplier (kafka 3.1.0 API) +KeyValueBytesStoreSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/KeyValueIterator.html b/31/javadoc/org/apache/kafka/streams/state/KeyValueIterator.html index d1d5bf98c..8f185b7ca 100644 --- a/31/javadoc/org/apache/kafka/streams/state/KeyValueIterator.html +++ b/31/javadoc/org/apache/kafka/streams/state/KeyValueIterator.html @@ -2,7 +2,7 @@ -KeyValueIterator (kafka 3.1.0 API) +KeyValueIterator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/KeyValueStore.html b/31/javadoc/org/apache/kafka/streams/state/KeyValueStore.html index 8e07a5f6f..ba758a405 100644 --- a/31/javadoc/org/apache/kafka/streams/state/KeyValueStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/KeyValueStore.html @@ -2,7 +2,7 @@ -KeyValueStore (kafka 3.1.0 API) +KeyValueStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreType.html b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreType.html index a2af118e2..24b2b847d 100644 --- a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreType.html +++ b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreType.html @@ -2,7 +2,7 @@ -QueryableStoreType (kafka 3.1.0 API) +QueryableStoreType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.KeyValueStoreType.html b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.KeyValueStoreType.html index 745855a9c..8167066b7 100644 --- 
a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.KeyValueStoreType.html +++ b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.KeyValueStoreType.html @@ -2,7 +2,7 @@ -QueryableStoreTypes.KeyValueStoreType (kafka 3.1.0 API) +QueryableStoreTypes.KeyValueStoreType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.SessionStoreType.html b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.SessionStoreType.html index 7ca4a7ffc..bf3c2d6dc 100644 --- a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.SessionStoreType.html +++ b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.SessionStoreType.html @@ -2,7 +2,7 @@ -QueryableStoreTypes.SessionStoreType (kafka 3.1.0 API) +QueryableStoreTypes.SessionStoreType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.WindowStoreType.html b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.WindowStoreType.html index 533d96d6c..b2af7a080 100644 --- a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.WindowStoreType.html +++ b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.WindowStoreType.html @@ -2,7 +2,7 @@ -QueryableStoreTypes.WindowStoreType (kafka 3.1.0 API) +QueryableStoreTypes.WindowStoreType (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.html b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.html index 924b9f476..83a0535bd 100644 --- a/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.html +++ b/31/javadoc/org/apache/kafka/streams/state/QueryableStoreTypes.html @@ -2,7 +2,7 @@ -QueryableStoreTypes (kafka 3.1.0 API) +QueryableStoreTypes (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/ReadOnlyKeyValueStore.html b/31/javadoc/org/apache/kafka/streams/state/ReadOnlyKeyValueStore.html index 8ff54d2d9..2f9458177 100644 --- 
a/31/javadoc/org/apache/kafka/streams/state/ReadOnlyKeyValueStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/ReadOnlyKeyValueStore.html @@ -2,7 +2,7 @@ -ReadOnlyKeyValueStore (kafka 3.1.0 API) +ReadOnlyKeyValueStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/ReadOnlySessionStore.html b/31/javadoc/org/apache/kafka/streams/state/ReadOnlySessionStore.html index ce218951d..08adb93e3 100644 --- a/31/javadoc/org/apache/kafka/streams/state/ReadOnlySessionStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/ReadOnlySessionStore.html @@ -2,7 +2,7 @@ -ReadOnlySessionStore (kafka 3.1.0 API) +ReadOnlySessionStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/ReadOnlyWindowStore.html b/31/javadoc/org/apache/kafka/streams/state/ReadOnlyWindowStore.html index 452b3d4ea..e80b003a4 100644 --- a/31/javadoc/org/apache/kafka/streams/state/ReadOnlyWindowStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/ReadOnlyWindowStore.html @@ -2,7 +2,7 @@ -ReadOnlyWindowStore (kafka 3.1.0 API) +ReadOnlyWindowStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/RocksDBConfigSetter.html b/31/javadoc/org/apache/kafka/streams/state/RocksDBConfigSetter.html index e64f4df3a..8387e4552 100644 --- a/31/javadoc/org/apache/kafka/streams/state/RocksDBConfigSetter.html +++ b/31/javadoc/org/apache/kafka/streams/state/RocksDBConfigSetter.html @@ -2,7 +2,7 @@ -RocksDBConfigSetter (kafka 3.1.0 API) +RocksDBConfigSetter (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/SessionBytesStoreSupplier.html b/31/javadoc/org/apache/kafka/streams/state/SessionBytesStoreSupplier.html index 9a7b903ee..00c2206c8 100644 --- a/31/javadoc/org/apache/kafka/streams/state/SessionBytesStoreSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/state/SessionBytesStoreSupplier.html @@ -2,7 +2,7 @@ -SessionBytesStoreSupplier (kafka 3.1.0 API) +SessionBytesStoreSupplier (kafka 3.1.1 API) diff --git 
a/31/javadoc/org/apache/kafka/streams/state/SessionStore.html b/31/javadoc/org/apache/kafka/streams/state/SessionStore.html index aaa78a532..c55099d90 100644 --- a/31/javadoc/org/apache/kafka/streams/state/SessionStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/SessionStore.html @@ -2,7 +2,7 @@ -SessionStore (kafka 3.1.0 API) +SessionStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/StateSerdes.html b/31/javadoc/org/apache/kafka/streams/state/StateSerdes.html index 0ce2dc9be..0fcc513c9 100644 --- a/31/javadoc/org/apache/kafka/streams/state/StateSerdes.html +++ b/31/javadoc/org/apache/kafka/streams/state/StateSerdes.html @@ -2,7 +2,7 @@ -StateSerdes (kafka 3.1.0 API) +StateSerdes (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/StoreBuilder.html b/31/javadoc/org/apache/kafka/streams/state/StoreBuilder.html index 28f4d1b1c..fc36ab1bc 100644 --- a/31/javadoc/org/apache/kafka/streams/state/StoreBuilder.html +++ b/31/javadoc/org/apache/kafka/streams/state/StoreBuilder.html @@ -2,7 +2,7 @@ -StoreBuilder (kafka 3.1.0 API) +StoreBuilder (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/StoreSupplier.html b/31/javadoc/org/apache/kafka/streams/state/StoreSupplier.html index d40c27337..254b70e6a 100644 --- a/31/javadoc/org/apache/kafka/streams/state/StoreSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/state/StoreSupplier.html @@ -2,7 +2,7 @@ -StoreSupplier (kafka 3.1.0 API) +StoreSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/Stores.html b/31/javadoc/org/apache/kafka/streams/state/Stores.html index bf460fec0..632841438 100644 --- a/31/javadoc/org/apache/kafka/streams/state/Stores.html +++ b/31/javadoc/org/apache/kafka/streams/state/Stores.html @@ -2,7 +2,7 @@ -Stores (kafka 3.1.0 API) +Stores (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/StreamsMetadata.html b/31/javadoc/org/apache/kafka/streams/state/StreamsMetadata.html 
index 88fce99fa..69fd7e950 100644 --- a/31/javadoc/org/apache/kafka/streams/state/StreamsMetadata.html +++ b/31/javadoc/org/apache/kafka/streams/state/StreamsMetadata.html @@ -2,7 +2,7 @@ -StreamsMetadata (kafka 3.1.0 API) +StreamsMetadata (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/TimestampedBytesStore.html b/31/javadoc/org/apache/kafka/streams/state/TimestampedBytesStore.html index 23b504575..062196230 100644 --- a/31/javadoc/org/apache/kafka/streams/state/TimestampedBytesStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/TimestampedBytesStore.html @@ -2,7 +2,7 @@ -TimestampedBytesStore (kafka 3.1.0 API) +TimestampedBytesStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/TimestampedKeyValueStore.html b/31/javadoc/org/apache/kafka/streams/state/TimestampedKeyValueStore.html index 61057b89e..7575a1552 100644 --- a/31/javadoc/org/apache/kafka/streams/state/TimestampedKeyValueStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/TimestampedKeyValueStore.html @@ -2,7 +2,7 @@ -TimestampedKeyValueStore (kafka 3.1.0 API) +TimestampedKeyValueStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/TimestampedWindowStore.html b/31/javadoc/org/apache/kafka/streams/state/TimestampedWindowStore.html index c6100efd2..8ee8fce56 100644 --- a/31/javadoc/org/apache/kafka/streams/state/TimestampedWindowStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/TimestampedWindowStore.html @@ -2,7 +2,7 @@ -TimestampedWindowStore (kafka 3.1.0 API) +TimestampedWindowStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/ValueAndTimestamp.html b/31/javadoc/org/apache/kafka/streams/state/ValueAndTimestamp.html index 48b64f211..80af601fe 100644 --- a/31/javadoc/org/apache/kafka/streams/state/ValueAndTimestamp.html +++ b/31/javadoc/org/apache/kafka/streams/state/ValueAndTimestamp.html @@ -2,7 +2,7 @@ -ValueAndTimestamp (kafka 3.1.0 API) +ValueAndTimestamp (kafka 3.1.1 API) 
diff --git a/31/javadoc/org/apache/kafka/streams/state/WindowBytesStoreSupplier.html b/31/javadoc/org/apache/kafka/streams/state/WindowBytesStoreSupplier.html index e6604b65b..a613da45d 100644 --- a/31/javadoc/org/apache/kafka/streams/state/WindowBytesStoreSupplier.html +++ b/31/javadoc/org/apache/kafka/streams/state/WindowBytesStoreSupplier.html @@ -2,7 +2,7 @@ -WindowBytesStoreSupplier (kafka 3.1.0 API) +WindowBytesStoreSupplier (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/WindowStore.html b/31/javadoc/org/apache/kafka/streams/state/WindowStore.html index 339ac610b..e88f3237c 100644 --- a/31/javadoc/org/apache/kafka/streams/state/WindowStore.html +++ b/31/javadoc/org/apache/kafka/streams/state/WindowStore.html @@ -2,7 +2,7 @@ -WindowStore (kafka 3.1.0 API) +WindowStore (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/WindowStoreIterator.html b/31/javadoc/org/apache/kafka/streams/state/WindowStoreIterator.html index a19be3f31..c18c505d0 100644 --- a/31/javadoc/org/apache/kafka/streams/state/WindowStoreIterator.html +++ b/31/javadoc/org/apache/kafka/streams/state/WindowStoreIterator.html @@ -2,7 +2,7 @@ -WindowStoreIterator (kafka 3.1.0 API) +WindowStoreIterator (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/package-summary.html b/31/javadoc/org/apache/kafka/streams/state/package-summary.html index 65e03ada2..dde0198c6 100644 --- a/31/javadoc/org/apache/kafka/streams/state/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/state/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.state (kafka 3.1.0 API) +org.apache.kafka.streams.state (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/state/package-tree.html b/31/javadoc/org/apache/kafka/streams/state/package-tree.html index 620cc09e7..ea881ef27 100644 --- a/31/javadoc/org/apache/kafka/streams/state/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/state/package-tree.html @@ -2,7 +2,7 @@ 
-org.apache.kafka.streams.state Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.state Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/test/TestRecord.html b/31/javadoc/org/apache/kafka/streams/test/TestRecord.html index bfb76542c..b13622ee9 100644 --- a/31/javadoc/org/apache/kafka/streams/test/TestRecord.html +++ b/31/javadoc/org/apache/kafka/streams/test/TestRecord.html @@ -2,7 +2,7 @@ -TestRecord (kafka 3.1.0 API) +TestRecord (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/test/package-summary.html b/31/javadoc/org/apache/kafka/streams/test/package-summary.html index 9bf78ef61..cfb86a9e3 100644 --- a/31/javadoc/org/apache/kafka/streams/test/package-summary.html +++ b/31/javadoc/org/apache/kafka/streams/test/package-summary.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.test (kafka 3.1.0 API) +org.apache.kafka.streams.test (kafka 3.1.1 API) diff --git a/31/javadoc/org/apache/kafka/streams/test/package-tree.html b/31/javadoc/org/apache/kafka/streams/test/package-tree.html index b33c086e5..2b77494cc 100644 --- a/31/javadoc/org/apache/kafka/streams/test/package-tree.html +++ b/31/javadoc/org/apache/kafka/streams/test/package-tree.html @@ -2,7 +2,7 @@ -org.apache.kafka.streams.test Class Hierarchy (kafka 3.1.0 API) +org.apache.kafka.streams.test Class Hierarchy (kafka 3.1.1 API) diff --git a/31/javadoc/overview-summary.html b/31/javadoc/overview-summary.html index bd0014607..2014b07e8 100644 --- a/31/javadoc/overview-summary.html +++ b/31/javadoc/overview-summary.html @@ -2,7 +2,7 @@ -kafka 3.1.0 API +kafka 3.1.1 API diff --git a/31/javadoc/overview-tree.html b/31/javadoc/overview-tree.html index 882e9e795..5b724a8d6 100644 --- a/31/javadoc/overview-tree.html +++ b/31/javadoc/overview-tree.html @@ -2,7 +2,7 @@ -Class Hierarchy (kafka 3.1.0 API) +Class Hierarchy (kafka 3.1.1 API) @@ -67,6 +67,7 @@

    Hierarchy For All Packages

  • org.apache.kafka.common.resource,
  • org.apache.kafka.common.security.auth,
  • org.apache.kafka.common.security.oauthbearer,
  • +
  • org.apache.kafka.common.security.oauthbearer.secured,
  • org.apache.kafka.common.security.plain,
  • org.apache.kafka.common.security.scram,
  • org.apache.kafka.common.security.token.delegation,
  • @@ -186,6 +187,8 @@

    Class Hierarchy

  • org.apache.kafka.common.acl.AccessControlEntry
  • org.apache.kafka.common.acl.AccessControlEntryFilter
  • +
  • org.apache.kafka.common.security.oauthbearer.secured.AccessTokenRetrieverFactory
  • +
  • org.apache.kafka.common.security.oauthbearer.secured.AccessTokenValidatorFactory
  • org.apache.kafka.common.acl.AclBinding
  • org.apache.kafka.common.acl.AclBindingFilter
  • org.apache.kafka.server.authorizer.AclCreateResult
  • @@ -205,6 +208,7 @@

    Class Hierarchy

  • org.apache.kafka.clients.admin.AlterPartitionReassignmentsResult
  • org.apache.kafka.clients.admin.AlterReplicaLogDirsResult
  • org.apache.kafka.clients.admin.AlterUserScramCredentialsResult
  • +
  • org.apache.kafka.common.security.oauthbearer.secured.BasicOAuthBearerToken (implements org.apache.kafka.common.security.oauthbearer.OAuthBearerToken)
  • org.apache.kafka.streams.kstream.Branched<K,V>
  • org.apache.kafka.common.serialization.ByteArrayDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.ByteArraySerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
  • @@ -213,6 +217,7 @@

    Class Hierarchy

  • org.apache.kafka.common.serialization.BytesDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.BytesSerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
  • org.apache.kafka.connect.mirror.Checkpoint
  • +
  • org.apache.kafka.common.security.oauthbearer.secured.ClaimValidationUtils
  • org.apache.kafka.common.quota.ClientQuotaAlteration
  • org.apache.kafka.common.quota.ClientQuotaAlteration.Op
  • org.apache.kafka.common.quota.ClientQuotaEntity
  • @@ -240,6 +245,7 @@

    Class Hierarchy

  • org.apache.kafka.common.config.ConfigResource
  • org.apache.kafka.common.config.ConfigTransformer
  • org.apache.kafka.common.config.ConfigTransformerResult
  • + org.apache.kafka.common.security.oauthbearer.secured.ConfigurationUtils
  • org.apache.kafka.common.config.ConfigValue
  • org.apache.kafka.connect.header.ConnectHeaders (implements org.apache.kafka.connect.header.Headers)
  • org.apache.kafka.connect.connector.Connector (implements org.apache.kafka.connect.components.Versioned)

@@ -323,6 +329,7 @@

  • org.apache.kafka.clients.admin.FeatureUpdate
  • org.apache.kafka.connect.data.Field
  • org.apache.kafka.common.config.provider.FileConfigProvider (implements org.apache.kafka.common.config.provider.ConfigProvider)
  • + org.apache.kafka.common.security.oauthbearer.secured.FileTokenRetriever (implements org.apache.kafka.common.security.oauthbearer.secured.AccessTokenRetriever)
  • org.apache.kafka.clients.admin.FinalizedVersionRange
  • org.apache.kafka.common.serialization.FloatDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.FloatSerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
@@ -334,11 +341,14 @@

  • org.apache.kafka.common.metrics.stats.Histogram.ConstantBinScheme (implements org.apache.kafka.common.metrics.stats.Histogram.BinScheme)
  • org.apache.kafka.common.metrics.stats.Histogram.LinearBinScheme (implements org.apache.kafka.common.metrics.stats.Histogram.BinScheme)
  • org.apache.kafka.streams.state.HostInfo
  • + org.apache.kafka.common.security.oauthbearer.secured.HttpAccessTokenRetriever (implements org.apache.kafka.common.security.oauthbearer.secured.AccessTokenRetriever)
  • org.apache.kafka.common.serialization.IntegerDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.IntegerSerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
  • org.apache.kafka.common.annotation.InterfaceStability
  • + org.apache.kafka.common.security.oauthbearer.secured.JaasOptionsUtils
  • org.apache.kafka.common.metrics.JmxReporter (implements org.apache.kafka.common.metrics.MetricsReporter)
  • org.apache.kafka.streams.kstream.Joined<K,V,VO>
  • + org.apache.kafka.common.security.oauthbearer.secured.JwksFileVerificationKeyResolver (implements org.apache.kafka.common.security.oauthbearer.secured.CloseableVerificationKeyResolver)
  • org.apache.kafka.clients.consumer.KafkaConsumer<K,V> (implements org.apache.kafka.clients.consumer.Consumer<K,V>)
  • org.apache.kafka.common.KafkaFuture<T> (implements java.util.concurrent.Future<V>)
  • org.apache.kafka.common.KafkaFuture.Function<A,B> (implements org.apache.kafka.common.KafkaFuture.BaseFunction<A,B>)
@@ -363,6 +373,7 @@

  • org.apache.kafka.streams.errors.LogAndFailExceptionHandler (implements org.apache.kafka.streams.errors.DeserializationExceptionHandler)
  • org.apache.kafka.streams.processor.LogAndSkipOnInvalidTimestamp
  • org.apache.kafka.clients.admin.LogDirDescription
  • + org.apache.kafka.common.security.oauthbearer.secured.LoginAccessTokenValidator (implements org.apache.kafka.common.security.oauthbearer.secured.AccessTokenValidator)
  • org.apache.kafka.common.config.LogLevelConfig
  • org.apache.kafka.server.log.remote.storage.LogSegmentData
  • org.apache.kafka.common.serialization.LongDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
@@ -391,9 +402,11 @@

  • org.apache.kafka.clients.admin.NewTopic
  • org.apache.kafka.common.Node
  • org.apache.kafka.common.security.oauthbearer.OAuthBearerExtensionsValidatorCallback (implements javax.security.auth.callback.Callback)
  • + org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler (implements org.apache.kafka.common.security.auth.AuthenticateCallbackHandler)
  • org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule (implements javax.security.auth.spi.LoginModule)
  • org.apache.kafka.common.security.oauthbearer.OAuthBearerTokenCallback (implements javax.security.auth.callback.Callback)
  • org.apache.kafka.common.security.oauthbearer.OAuthBearerValidatorCallback (implements javax.security.auth.callback.Callback)
  • + org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerValidatorCallbackHandler (implements org.apache.kafka.common.security.auth.AuthenticateCallbackHandler)
  • org.apache.kafka.clients.consumer.OffsetAndMetadata (implements java.io.Serializable)
  • org.apache.kafka.clients.consumer.OffsetAndTimestamp
  • org.apache.kafka.clients.admin.OffsetSpec

@@ -427,6 +440,8 @@

  • org.apache.kafka.streams.processor.api.Record<K,V>
  • org.apache.kafka.clients.producer.RecordMetadata
  • org.apache.kafka.clients.admin.RecordsToDelete
  • + org.apache.kafka.common.security.oauthbearer.secured.RefreshingHttpsJwks (implements java.io.Closeable, org.apache.kafka.common.security.oauthbearer.secured.Initable)
  • + org.apache.kafka.common.security.oauthbearer.secured.RefreshingHttpsJwksVerificationKeyResolver (implements org.apache.kafka.common.security.oauthbearer.secured.CloseableVerificationKeyResolver)
  • org.apache.kafka.connect.mirror.RemoteClusterUtils
  • org.apache.kafka.server.log.remote.storage.RemoteLogMetadata
  • + org.apache.kafka.common.security.oauthbearer.secured.UnretryableException
  • + org.apache.kafka.common.security.oauthbearer.secured.ValidateException
  • org.apache.kafka.common.errors.WakeupException
@@ -785,11 +804,13 @@

  • org.apache.kafka.common.Uuid (implements java.lang.Comparable<T>)
  • org.apache.kafka.common.serialization.UUIDDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.UUIDSerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
  • + org.apache.kafka.common.security.oauthbearer.secured.ValidatorAccessTokenValidator (implements org.apache.kafka.common.security.oauthbearer.secured.AccessTokenValidator)
  • org.apache.kafka.common.metrics.stats.Value (implements org.apache.kafka.common.metrics.MeasurableStat)
  • org.apache.kafka.streams.state.ValueAndTimestamp<V>
  • org.apache.kafka.connect.data.Values
  • org.apache.kafka.connect.data.Values.Parser
  • org.apache.kafka.connect.data.Values.SchemaDetector
  • + org.apache.kafka.common.security.oauthbearer.secured.VerificationKeyResolverFactory
  • org.apache.kafka.common.serialization.VoidDeserializer (implements org.apache.kafka.common.serialization.Deserializer<T>)
  • org.apache.kafka.common.serialization.VoidSerializer (implements org.apache.kafka.common.serialization.Serializer<T>)
  • org.apache.kafka.streams.processor.WallclockTimestampExtractor (implements org.apache.kafka.streams.processor.TimestampExtractor)
@@ -810,6 +831,7 @@

    Interface Hierarchy