Fix misspellings (elastic#19981)
jsoref authored and cjcenizal committed Jun 27, 2018
1 parent ad44ae9 commit 2b27fb1
Showing 472 changed files with 755 additions and 754 deletions.
2 changes: 1 addition & 1 deletion .eslintrc.js
@@ -199,7 +199,7 @@ module.exports = {

/**
* Files that require Apache 2.0 headers, settings
- * are overriden below for files that require Elastic
+ * are overridden below for files that require Elastic
* Licence headers
*/
{
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -111,7 +111,7 @@ You want to make sure there are no merge conflicts. If there are merge conflicts

You can use `git status` to see which files contain conflicts. They'll be the ones that aren't staged for commit. Open those files, and look for where git has marked the conflicts. Resolve the conflicts so that the changes you want to make to the code have been incorporated in a way that doesn't destroy work that's been done in master. Refer to master's commit history on GitHub if you need to gain a better understanding of how code is conflicting and how best to resolve it.

- Once you've resolved all of the merge conflicts, use `git add -A` to stage them to be commiteed, and then use `git rebase --continue` to tell git to continue the rebase.
+ Once you've resolved all of the merge conflicts, use `git add -A` to stage them to be committed, and then use `git rebase --continue` to tell git to continue the rebase.

When the rebase has completed, you will need to force push your branch because the history is now completely different than what's on the remote. **This is potentially dangerous** because it will completely overwrite what you have on the remote, so you need to be sure that you haven't lost any work when resolving merge conflicts. (If there weren't any merge conflicts, then you can force push without having to worry about this.)

@@ -404,7 +404,7 @@ Please make sure you have signed the [Contributor License Agreement](http://www.

## Submitting a Pull Request

- Push your local changes to your forked copy of the repository and submit a Pull Request. In the Pull Request, describe what your changes do and mention the number of the issue where discussion has taken place, eg “Closes #123″.
+ Push your local changes to your forked copy of the repository and submit a Pull Request. In the Pull Request, describe what your changes do and mention the number of the issue where discussion has taken place, e.g., “Closes #123″.

Always submit your pull against `master` unless the bug is only present in an older version. If the bug affects both `master` and another branch say so in your pull.

2 changes: 1 addition & 1 deletion docs/dashboard.asciidoc
@@ -90,7 +90,7 @@ image:images/Dashboard_visualization_data.png[Example of visualization data]
To export the visualization data as a comma separated values
(CSV) file, click *Raw* or *Formatted* at the bottom of the data
table. *Raw* exports the response data as provided. *Formatted*
- exports the reponse data using applicable Kibana <<managing-fields,field
+ exports the response data using applicable Kibana <<managing-fields,field
formatters>>.

To return to the visualization, click the *Collapse* button in the lower left
@@ -92,7 +92,7 @@ The `FunctionalTestRunner` automatically transpiles functional tests using babel

Code run by the `FunctionalTestRunner` is wrapped in a function so it can be passed around via config files and be parameterized. Any of these Provider functions may be asynchronous and should return/resolve-to the value they are meant to _provide_. Provider functions will always be called with a single argument: a provider API (see the <<functional_test_runner_provider_api,Provider API Section>>).

- A config provder:
+ A config provider:

["source","js"]
-----------
@@ -34,7 +34,7 @@ export default async function ({ readConfigFile }) {
// define the name and providers for services that should be
// available to your tests. If you don't specify anything here
- // only the built-in services will be avaliable
+ // only the built-in services will be available
services: {
...kibanaConfig.get('services'),
myService: MyServiceProvider,
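To make the provider contract above concrete, here is a minimal sketch of a config provider plus the service provider it registers, assuming the shape shown in the excerpts; the test file path, `myService`, and `MyServiceProvider` are illustrative names, not part of this commit.

[source,js]
----
// my_config.js - a config provider: an (optionally async) function that
// receives the provider API and resolves to the configuration it provides.
import { MyServiceProvider } from './services/my_service';

export default async function ({ readConfigFile }) {
  // build on another config file, as in the excerpt above
  const kibanaConfig = await readConfigFile(require.resolve('../functional/config'));

  return {
    testFiles: [require.resolve('./tests')],
    services: {
      ...kibanaConfig.get('services'),
      myService: MyServiceProvider,
    },
  };
}

// services/my_service.js - a service provider is likewise called with the
// provider API and returns (or resolves to) the value it provides.
export function MyServiceProvider({ getService }) {
  const log = getService('log');
  return {
    greet(name) {
      log.info(`hello ${name}`);
    },
  };
}
----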
@@ -81,7 +81,7 @@ Each of the factories have some of the custom parameters, which will be describe
[[development-base-visualization-type]]
==== Base Visualization Type
The base visualization type does not make any assumptions about the rendering technology you are going to use and
- works with pure Javascript. It is the visualization type we recommend to use.
+ works with pure JavaScript. It is the visualization type we recommend to use.

You need to provide a type with a constructor function, a render method which will be called every time
options or data change, and a destroy method which will be called to cleanup.
@@ -387,7 +387,7 @@ import { VisFactoryProvider } from 'ui/vis/vis_factory';
const myResponseHandler = (vis, response) => {
// transform the response (based on vis object?)
- const resposne = ... transform data ...;
+ const response = ... transform data ...;
return response;
};
@@ -49,7 +49,7 @@ found. It will reject, if the `id` is invalid.

The returned `EmbeddedVisualizeHandler` itself has the following methods and properties:

- - `destroy()`: destroys the underlying Angualr scope of the visualization
+ - `destroy()`: destroys the underlying Angular scope of the visualization
- `getElement()`: a reference to the jQuery wrapped DOM element, that renders the visualization
- `whenFirstRenderComplete()`: will return a promise, that resolves as soon as the visualization has
finished rendering for the first time
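A hedged usage sketch of the handler methods listed above; `embedVisualization` stands in for whichever embedding call resolves to the handler and is passed in here as an assumption, not an API from this diff.

[source,js]
----
async function showVisualization(embedVisualization, domNode) {
  // assume the embed call resolves to an EmbeddedVisualizeHandler
  const handler = await embedVisualization(domNode, 'my-saved-vis-id');

  await handler.whenFirstRenderComplete(); // first render has finished
  const $el = handler.getElement();        // jQuery-wrapped DOM element
  // ... interact with $el as needed ...
  handler.destroy();                       // tears down the Angular scope
}
----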
2 changes: 1 addition & 1 deletion docs/management.asciidoc
@@ -9,7 +9,7 @@ patterns, advanced settings that tweak the behaviors of Kibana itself, and
the various "objects" that you can save throughout Kibana such as searches,
visualizations, and dashboards.

- This section is pluginable, so in addition to the out of the box capabitilies,
+ This section is pluginable, so in addition to the out of the box capabilities,
packs such as {xpack} can add additional management capabilities to Kibana.
--

2 changes: 1 addition & 1 deletion docs/management/managing-fields.asciidoc
@@ -77,7 +77,7 @@ options are {ref}/modules-scripting-expression.html[Lucene expressions] and {ref
While you can use other scripting languages if you enable dynamic scripting for them in Elasticsearch, this is not recommended
because they cannot be sufficiently {ref}/modules-scripting-security.html[sandboxed].

- WARNING: Use of Groovy, Javascript, and Python scripting is deprecated starting in Elasticsearch 5.0, and support for those
+ WARNING: Use of Groovy, JavaScript, and Python scripting is deprecated starting in Elasticsearch 5.0, and support for those
scripting languages will be removed in the future.

You can reference any single value numeric field in your expressions, for example:
2 changes: 1 addition & 1 deletion docs/monitoring/monitoring-xkib.asciidoc
@@ -61,7 +61,7 @@ used when {kib} sends monitoring data to the production cluster.

.. Configure {kib} to encrypt communications between the {kib} server and the
production cluster. This set up involves generating a server certificate and
- setting `server.ssl.*` and `elasticsearch.ssl.certitifcateAuthorities` settings
+ setting `server.ssl.*` and `elasticsearch.ssl.certificateAuthorities` settings
in the `kibana.yml` file on the {kib} server. For example:
+
--
2 changes: 1 addition & 1 deletion docs/reporting/chromium-sandbox.asciidoc
@@ -19,6 +19,6 @@ recommended to enable usernamespaces and set `xpack.reporting.capture.browser.ch
`kibana.yml` to enable the sandbox.

==== Docker
- When runnning Kibana in a Docker container, all container processes are run within a usernamespace with seccomp-bpf and
+ When running Kibana in a Docker container, all container processes are run within a usernamespace with seccomp-bpf and
AppArmor profiles that prevent the Chromium sandbox from being used. In these situations, disabling the sandbox is recommended,
as the container implements similar security mechanisms.
4 changes: 2 additions & 2 deletions docs/reporting/development/csv-integration.asciidoc
@@ -21,7 +21,7 @@ interface jobParameters {
}
----

- The `searchRequest.body` should abide by the {ref}/search-request-body.html[Elasticsearch Seach Request Body] syntax
+ The `searchRequest.body` should abide by the {ref}/search-request-body.html[Elasticsearch Search Request Body] syntax

[float]
==== `export-config` Directive
@@ -48,4 +48,4 @@ function getSharingTitle() string;
----

- The `sharingData.searchRequest.body` should abide by the {ref}/search-request-body.html[Elasticsearch Seach Request Body] syntax
+ The `sharingData.searchRequest.body` should abide by the {ref}/search-request-body.html[Elasticsearch Search Request Body] syntax
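For illustration, a `searchRequest` whose `body` follows the Search Request Body syntax might look like the following sketch; the index pattern and fields are invented for the example, and only `searchRequest` itself comes from the interface above.

[source,js]
----
const jobParameters = {
  searchRequest: {
    index: 'logstash-*',                   // hypothetical index pattern
    body: {
      query: { match_all: {} },            // any valid query DSL
      _source: ['@timestamp', 'message'],  // columns to export
      sort: [{ '@timestamp': { order: 'desc' } }],
      size: 500,
    },
  },
};
----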
2 changes: 1 addition & 1 deletion docs/reporting/development/index.asciidoc
@@ -1,7 +1,7 @@
[role="xpack"]
[[reporting-integration]]
== Reporting Integration
- Intergrating a Kibana application with {reporting} requires a minimum amount of code, and the goal is to not have to
+ Integrating a Kibana application with {reporting} requires a minimum amount of code, and the goal is to not have to
modify the Reporting code as we add additional applications. Instead, applications abide by a contract that Reporting
uses to determine the information that is required to export CSVs and PDFs.

8 changes: 4 additions & 4 deletions docs/reporting/reporting-troubleshooting.asciidoc
@@ -19,7 +19,7 @@ an explanation of why the failure occurred and what you can do to fix it.
[float]
==== `You must install fontconfig and freetype for Reporting to work'`
Reporting using PhantomJS, the default browser, relies on system packages. Install the appropriate fontconfig and freetype
- packages for your distrobution.
+ packages for your distribution.

[float]
==== `Max attempts reached (3)`
@@ -35,16 +35,16 @@ setting. If the PDF report fails with "Max attempts reached (3)," check your <<r
[float]
==== `You must install freetype and ttf-font for Reporting to work`
Reporting using the Chromium browser relies on system packages and at least one system font. Install the appropriate fontconfig and freetype
- packages for your distrobution and at least one system font.
+ packages for your distribution and at least one system font.

[float]
==== `You must install nss for Reporting to work`
- Reporting using the Chromium browser relies on the Network Security Service libraries (NSS). Install the appropriate nss package for your distrobution.
+ Reporting using the Chromium browser relies on the Network Security Service libraries (NSS). Install the appropriate nss package for your distribution.

[float]
==== `Unable to use Chromium sandbox. This can be disabled at your own risk with 'xpack.reporting.capture.browser.chromium.disableSandbox'`
Chromium uses sandboxing techniques that are built on top of operating system primitives. The Linux sandbox depends on user namespaces,
- which were introduced with the 3.8 Linux kernel. However, many distrobutions don't have user namespaces enabled by default, or they require
+ which were introduced with the 3.8 Linux kernel. However, many distributions don't have user namespaces enabled by default, or they require
the CAP_SYS_ADMIN capability.

Elastic recommends that you research the feasibility of enabling unprivileged user namespaces before disabling the sandbox. An exception
2 changes: 1 addition & 1 deletion docs/setup.asciidoc
@@ -34,7 +34,7 @@ version of Kibana that is newer than the version of Elasticsearch (e.g. Kibana
5.1 and Elasticsearch 5.0).

Running a minor version of Elasticsearch that is higher than Kibana will
- generally work in order to faciliate an upgrade process where Elasticsearch
+ generally work in order to facilitate an upgrade process where Elasticsearch
is upgraded first (e.g. Kibana 5.0 and Elasticsearch 5.1). In this
configuration, a warning will be logged on Kibana server startup, so it's only
meant to be temporary until Kibana is upgraded to the same version as
@@ -39,7 +39,7 @@ image::images/timelion-conditional02.png[]

For additional information on Timelions conditional capabilities, check out the blog post https://www.elastic.co/blog/timeseries-if-then-else-with-timelion[I have but one .condition()].

- Now that you have thresholds defined to easily identify outliers, let’s create a new series to determine what the trend really is. Timelion's `mvavg()` function allows you to calculate the moving average over a given window. This is especially helpful for noisey time series. For this tutorial, you will use `.mvavg(10)` to create a moving average with a window of 10 data points. Use the following expression to create a moving average of the maximum memory usage:
+ Now that you have thresholds defined to easily identify outliers, let’s create a new series to determine what the trend really is. Timelion's `mvavg()` function allows you to calculate the moving average over a given window. This is especially helpful for noisy time series. For this tutorial, you will use `.mvavg(10)` to create a moving average with a window of 10 data points. Use the following expression to create a moving average of the maximum memory usage:

[source,text]
----------------------------------
2 changes: 1 addition & 1 deletion docs/timelion/getting-started/timelion-create.asciidoc
@@ -15,7 +15,7 @@ To start, you will need to define an `index`, `timefield` and `metric` in the fi
image::images/timelion-create01.png[]
{nbsp}

- Now you need to add another series with data from the previous hour for comparison. To do so, you'll have to add an `offset` arguement to the `.es()` function. `offset` will offset the series retrieval by a date expression. For this example, you'll want to offset the data back one hour and will be using the date expression `-1h`. Using a comma to separate the two series, enter the following expression into the Timelion query bar:
+ Now you need to add another series with data from the previous hour for comparison. To do so, you'll have to add an `offset` argument to the `.es()` function. `offset` will offset the series retrieval by a date expression. For this example, you'll want to offset the data back one hour and will be using the date expression `-1h`. Using a comma to separate the two series, enter the following expression into the Timelion query bar:

[source,text]
----------------------------------
2 changes: 1 addition & 1 deletion docs/timelion/getting-started/timelion-customize.asciidoc
@@ -20,7 +20,7 @@ Before making any other modifications, append the `title()` function to the end
image::images/timelion-customize01.png[]
{nbsp}

- To differentiate the last hour series a bit more, you are going to change the chart type to an area chart. In order do so, you'll need to use the `.lines()` function to customize the line chart. You'll be setting the `fill` and `width` arguements to set the fill of the line chart and line width respectively. In this example, you will set the fill level to 1 and the width of the border to 0.5 by appending `.lines(fill=1,width=0.5)`. Use the following expression in the Timelion query bar:
+ To differentiate the last hour series a bit more, you are going to change the chart type to an area chart. In order do so, you'll need to use the `.lines()` function to customize the line chart. You'll be setting the `fill` and `width` arguments to set the fill of the line chart and line width respectively. In this example, you will set the fill level to 1 and the width of the border to 0.5 by appending `.lines(fill=1,width=0.5)`. Use the following expression in the Timelion query bar:

[source,text]
----------------------------------
2 changes: 1 addition & 1 deletion docs/visualize.asciidoc
@@ -123,7 +123,7 @@ _buckets_ for the x-axis. Buckets are analogous to SQL `GROUP BY`
statements. Pie charts, use the metric for the slice size and the bucket
for the number of slices.

- You can futher break down the data by specifying sub aggregations. The first
+ You can further break down the data by specifying sub aggregations. The first
aggregation determines the data set for any subsequent aggregations. Sub
aggregations are applied in order--you can drag the aggregations to change the
order in which they're applied.
2 changes: 1 addition & 1 deletion docs/visualize/time-series-visual-builder.asciidoc
@@ -24,7 +24,7 @@ A histogram visualization that supports area, line, bar, and steps along with
multiple y-axis. You can fully customize the colors, points, line thickness
and fill opacity. This visualization also supports time shifting to compare two
time periods. This visualization also supports annotations which can be loaded from
- a seperate index based on a query.
+ a separate index based on a query.

image:images/tsvb-timeseries.png["Time Series Visualization"]

2 changes: 1 addition & 1 deletion packages/eslint-config-kibana/.eslintrc.js
@@ -92,7 +92,7 @@ module.exports = {
'wrap-iife': [ 'error', 'outside' ],
yoda: 'off',

- 'object-curly-spacing': 'off', // overriden with babel/object-curly-spacing
+ 'object-curly-spacing': 'off', // overridden with babel/object-curly-spacing
'babel/object-curly-spacing': [ 'error', 'always' ],

'jsx-quotes': ['error', 'prefer-double'],
2 changes: 1 addition & 1 deletion packages/kbn-dev-utils/src/proc_runner/proc.js
@@ -155,7 +155,7 @@ export function createProc(name, { cmd, args, cwd, env, stdin, log }) {
STOP_TIMEOUT,
async () => {
throw new Error(
- `Proc "${name}" was stopped but never emiited either the "exit" or "error" event after ${STOP_TIMEOUT} ms`
+ `Proc "${name}" was stopped but never emitted either the "exit" or "error" event after ${STOP_TIMEOUT} ms`
);
}
);
2 changes: 1 addition & 1 deletion packages/kbn-dev-utils/src/streams/promise_from_streams.js
@@ -27,7 +27,7 @@
* If the last stream is readable, it's final value
* will be provided as the promise value.
*
- * Errors emmitted from any stream will cause
+ * Errors emitted from any stream will cause
* the promise to be rejected with that error.
*
* @param {Array<Stream>} streams
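In other words, the helper pipes an array of streams together and settles a promise based on them. A usage sketch, under the assumption that sibling helpers such as `createListStream` and `createConcatStream` exist alongside it in this streams module (names and import path are assumptions, not shown in this diff):

[source,js]
----
import { createPromiseFromStreams, createListStream, createConcatStream } from '.';

async function example() {
  // pipes list -> concat; resolves with the final value of the last
  // (readable) stream, rejects if any stream emits an 'error' event
  const result = await createPromiseFromStreams([
    createListStream(['a', 'b', 'c']),
    createConcatStream(''),
  ]);
  return result; // 'abc'
}
----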
2 changes: 1 addition & 1 deletion packages/kbn-dev-utils/src/tooling_log/__tests__/log.js
@@ -91,7 +91,7 @@ describe('utils: createToolingLog(logLevel, output)', () => {
});
describe('invalid logLevel', () => {
it('throw error', () => {
- // avoid the impossiblity that a valid level is generated
+ // avoid the impossibility that a valid level is generated
// by specifying a long length
const level = chance.word({ length: 10 });

@@ -104,7 +104,7 @@ describe('parseLogLevel(logLevel).flags', () => {

describe('invalid logLevel', () => {
it('throws error', () => {
- // avoid the impossiblity that a valid level is generated
+ // avoid the impossibility that a valid level is generated
// by specifying a long length
const level = chance.word({ length: 10 });

2 changes: 1 addition & 1 deletion packages/kbn-es/src/install/archive.js
@@ -67,7 +67,7 @@ exports.installArchive = async function installArchive(archive, options = {}) {
};

/**
- * Recurive deletion for a directory
+ * Recursive deletion for a directory
*
* @param {String} path
*/
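A minimal sketch of what such a recursive directory-deletion helper can look like in Node; this is illustrative only, not the actual implementation in `archive.js`.

[source,js]
----
const fs = require('fs');
const path = require('path');

function rmrfSync(dir) {
  for (const name of fs.readdirSync(dir)) {
    const target = path.resolve(dir, name);
    if (fs.statSync(target).isDirectory()) {
      rmrfSync(target);      // recurse into subdirectories
    } else {
      fs.unlinkSync(target); // remove files
    }
  }
  fs.rmdirSync(dir);         // remove the now-empty directory
}
----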
2 changes: 1 addition & 1 deletion packages/kbn-es/src/install/snapshot.js
@@ -69,7 +69,7 @@ exports.installSnapshot = async function installSnapshot({
* @param {String} url
* @param {String} dest
* @param {ToolingLog} log
- * @returns {Promose}
+ * @returns {Promise}
*/
function downloadFile(url, dest, log) {
const downloadPath = `${dest}.tmp`;
@@ -1,8 +1,8 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP

- exports[`parses data containing execption 1`] = `"[o.e.n.Node] [qEfPPg8] starting ..."`;
+ exports[`parses data containing exception 1`] = `"[o.e.n.Node] [qEfPPg8] starting ..."`;

- exports[`parses data containing execption 2`] = `
+ exports[`parses data containing exception 2`] = `
"[o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]
org.elasticsearch.bootstrap.StartupException: BindHttpException; nested: BindException[Address already in use];
at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:125) ~[elasticsearch-7.0.0.jar:7.0.0-alpha1-SNAPSHOT]
@@ -15,7 +15,7 @@ Caused by: java.net.BindException: Address already in use
at java.lang.Thread.run(Thread.java:844) [?:?]"
`;

- exports[`parses data containing execption 3`] = `"[o.e.g.GatewayService] [qEfPPg8] recovered [0] indices into cluster_state"`;
+ exports[`parses data containing exception 3`] = `"[o.e.g.GatewayService] [qEfPPg8] recovered [0] indices into cluster_state"`;

exports[`parses multiple lines 1`] = `"[o.e.p.PluginsService] [qEfPPg8] loaded plugin [x-pack-security]"`;

2 changes: 1 addition & 1 deletion packages/kbn-es/src/utils/extract_config_files.js
@@ -23,7 +23,7 @@ const mkdirp = require('mkdirp');

/**
* Copies config references to an absolute path to
- * the provided destination. This is necicary as ES security
+ * the provided destination. This is necessary as ES security
* requires files to be within the installation directory
*
* @param {Array} config
2 changes: 1 addition & 1 deletion packages/kbn-es/src/utils/parse_es_log.test.js
@@ -42,7 +42,7 @@ test('parses multiple lines', () => {
expect(lines[1].message).toMatchSnapshot();
});

- test('parses data containing execption', () => {
+ test('parses data containing exception', () => {
const data = dedent(`
[2018-02-23T10:13:45,646][INFO ][o.e.n.Node ] [qEfPPg8] starting ...
[2018-02-23T10:13:53,992][WARN ][o.e.b.ElasticsearchUncaughtExceptionHandler] [] uncaught exception in thread [main]