TCS Wiki:Advanced text formatting and Dot product

{{essay|WP:TYPESET}}
This essay, '''Wikipedia:Advanced text formatting''' or '''Advanced typesetting''', describes techniques for controlling (or adjusting) the alignment of text on a page. For readers with professional backgrounds in [[typesetting]], this essay is not a joke, but rather an advance over the default typesetting of stub articles. The techniques listed here are still intended for general readers.
__TOC__
==Moving vanity-boxes lower in articles==
Perhaps the single greatest improvement to many articles is to lower those grandstanding top tag-boxes that proclaim, ''"This article is bad: change it now"''. Most of those tag-box templates accept&nbsp;a "|section" parameter when the tag-box is moved further down the page. For example: &#123;{<code>RefImprove|section|date=May 2009</code>}}. Moving a distracting tag-box can greatly improve readability for readers, who might otherwise be surprised and alarmed by a two-year-old gripe box that someone threw on the page, unopposed, years ago. Some tag-boxes give the impression that they must be read to avoid something very dangerous in the article.


==Avoiding wrap of end-quote or apostrophe==
One of the most troublesome typesetting [[wikt:glitch|glitch]]es is the wrapping of the last word in a quotation onto a second line, when followed by [[parenthesis|parentheses]] or brackets "[ ]":
{| class=wikitable width=410 align=center
|Typical wrapping of end-quotemark:
: "The quick brown fox jumped over the lazy<br>dogs" (typewriter exercise).
|}
There are several ways to allow the end-word to stay on the same line, without wrapping. Perhaps the most common fix is to append the blank-code "&amp;#160;" (or even a comma) after the end [[Quotation mark|quotemark]] so that it will not wrap too soon:


{| class=wikitable width=415 align=center
|Wrapping of end-quotemark plus &amp;#160;:
: "The quick brown fox jumped over the lazy dogs"&#160; (typewriter exercise).
|}
To fix the wrapping, the word ''dogs'' is followed by &amp;#160; (after the quotemark:&nbsp; dogs"&amp;#160; ). Because thousands of articles begin by formally defining a term and stating a quoted meaning, the forced wrapping of end-quotemarks has become a major typesetting nightmare in Wikipedia. The wrapping of end-quotes became so common during 2005–2010 that it is now instinctive to expect an end-quote to be prematurely wrapped onto a second line, with the first line truncated bizarrely short. The premature wrapping of the 18-character phrase "dogs...typewriter" is typical, not an exaggeration of how much text gets forced onto the second line in many articles.


A similar problem occurs with an end-apostrophe and parentheses:
{| class=wikitable width=370 align=center
|Typical wrapping of end-apostrophe:
: The film ''[[101 Dalmatians]]'' concerns all the [[Dalmatian (dog)|dalmatians]]' safety (''the problem occurs only if a parenthesis follows the apostrophe'').
: The film ''[[101 Dalmatians]]'' concerns all the [[Dalmatian (dog)|dalmatians]]' (there are 100+1 dogs) safety.
|-
|Wrapping of end-apostrophe plus &amp;#160;:
: The film ''[[101 Dalmatians]]'' concerns all the [[Dalmatian (dog)|dalmatians]]'&#160; (there are 100+1 dogs) safety.
|}
Besides using "&amp;#160;" other characters, such as comma, semicolon or slash, could be appended after the end-quotemark, if they fit the meaning. There might be other situations of forced wrapping in Wikipedia text.


==Setting small font-size of lesser text==
Lesser text can be reduced to a smaller ''[[font size]]'', such as by using a span-tag:
:: &lt;span style="font-size:87%">German: ''Der Lange-Annoying-Name-der-Dinge''&lt;/span>
That font-size will shrink the text somewhat: <span style="font-size:87%">German: ''Der Lange-Annoying-Name-der-Dinge''</span>. A highly irritating problem can be the placement of too much foreign (or off-topic) text in the intro section. Much tangent-level wording should be moved to lower sections. However, a reduced font-size can help minimize the glaring impact of off-topic text. Sizes such as 95% or 92% retain the original font shape; however, sizes of 87% or 82% might be needed. To reduce a larger section of text, consider using the paired &lt;div>...&lt;/div> tags (instead of "&lt;span>...&lt;/span>").


The default small text-size, with almost no shape, is selected by &lt;small>aa bb cc xx yy zz&lt;/small>, which appears as: <small>aa bb cc xx yy zz</small>. Using &lt;font face=Georgia> to switch from the default [[Arial font]] to <font face=Georgia>[[Georgia font]], the small text will appear as: <small>small Georgia-font a b c x y z</small></font>.


==Reducing line-height when wrapping small text==
A very common problem, when using a smaller font, is the gapping caused by large [[interline spacing]] between the lines. A better [[line height|line-height]] (for small-font notes) is: 1.3[[em]], such as by:
:: &lt;span style="font-size:87%; '''''line-height:''''' 1.3em;">xxx&lt;/span>
Such small lines could be used in a lengthy image caption, where the typical caption size would take too much space for the amount of detail being displayed. Of course, once again, a full solution often involves moving some excess text from the image caption to a lower spot on the page, and then referring to the image, such as "(''see image at right'')", from that text.


The colon-indent prefix ":" which indents lines (by about 7 spaces) also triggers a smaller line-height, so any indented wrapped-text will appear closer to the upper-text on the indented line.


==Reducing line-height in a quote-box==
Another use of reduced line-height is an indented quote-box, where the smaller line-height can help emphasize the quote as a special text section. For example, using:
:: &lt;span style="font-size:92%; '''''line-height:''''' 1.33em;">xxx&lt;/span>
Then enclose the text of a quote, such as the following:
::: <span style="font-size:92%; line-height: 1.33em;">For Science is a natural whole, the parts of which<br>mutually support one another in a way which,<br>to be sure, no one can anticipate.<br>{{in5|32}}&mdash;[[Albert Einstein]], ''[[Out of My Later Years]]''</span>
Sometimes, the setting must be precise: 1.30em would be too close, and 1.35em would separate the lines too much, while 1.33em provides an even balance for the particular lines in the quotation. Because the line-height is reduced, the effect of the quotation is different from merely indenting the text. Note how the line for the author "Einstein" has been indented far to the right: the indenter template {{tl2|in5|32}} was used to indent 32 spaces further than the quoted lines: &#123;{in5|32}}&amp;mdash;Albert...


==Expanding line-height for song lyrics or poems==
The opposite technique, of ''increased'' line-height, might be used for some indented [[song lyrics]] (or [[poetry]]), where the larger line-height could help in emphasizing the "double-spaced" appearance of a text section. For example, using:
:: &lt;span style="font-size:92%; '''''line-height:''''' 2.1em;">xxx&lt;/span>
Then enclose the song lyrics as "xxx", with line-breaks &lt;br>, as follows:
:::: <span style="font-size:92%; line-height: 2.1em;">I hear Jerusalem, bells are ringing,<br>Roman cavalry, choirs are singing,{{in5|15}}<code>&lt;</code>--(cavalry are horse-soldiers)<br>"Be my mirror, my sword and shield,<br>My missionaries in a foreign field",<br>For some reason....{{in5|10}}&mdash;[[Coldplay]], ''"[[Viva la Vida]]"''</span> <sup>[a]</sup>
Because the line-height is increased, the effect of the quoted text appears similar to having used double-spaced lines in the text. The extra spacing also helps for annotations, such as noting, in the chorus (for the 2008 song "[[Viva la Vida]]") how "Roman cavalry" refers to horse-soldiers, while the "choirs are singing" the lines "Be my mirror" (etc.). The length of each line is chosen to reflect the cadence or rhythm of the music, where the singer could take a breath at the end of each line. (Only part of the lyrics are listed, to keep them short, per copyright laws limiting to 10%, or prohibiting the display of an entire performable unit, of a song).


The exact coding of the song lyrics could be a single line, as follows:
: <nowiki>:::: <span style="font-size:92%; line-height: 2.1em;">I hear Jerusalem, bells are ringing, <br>Roman cavalry, choirs are singing,{{in5|15}}<code>&amp;lt;</code>--(cavalry are horse-soldiers) <br>"Be my mirror, my sword and shield,<br>My missionaries in a foreign field",<br>For some reason....{{in5|10}}&mdash;[[Coldplay]], ''"[[Viva la Vida]]"''</span> <sup>[a]</sup></nowiki>
For coding on multiple lines, use "&lt;div>" rather than "&lt;span>" tags.


==Word-joining to avoid one-word-per-line==
Words can be joined by "&amp;nbsp;" or {{tl2|nowrap|xxx xxx}} to force them to appear together, on one line. Sometimes text, next to a wide image-box or wide [[WP:infobox|infobox]], tends to get squeezed into a narrow column of text. A very narrow column can cause text-wrapping as, sometimes, one-word-per-line, all the way down the entire column. By word-joining the first few words of a phrase (such as "<font color="#666666">'''At'''&amp;nbsp;'''the'''&amp;nbsp;'''outset'''</font>"&nbsp; or&nbsp; "<font color="#666666">'''The'''&amp;nbsp;'''region'''&amp;nbsp;'''covers'''</font>"), the text can be forced down the page, into a wider column, where all the joined-words can fit side-by-side, across the line.  Then, even when the page is viewed in larger browser Text-Size settings, the joined text will float down to columns where the typesetting looks more logical, rather than the default, of one-word-per-line, in a narrow column.


==Bold but not too bold==
Bold-faced text can be softened, or thinned, by using <font color="#777777">'''dark-gray'''</font>, rather than typical black, as the text font-color. For example:
::<nowiki><font color="#666666">'''One''' and '''Two''' and '''Three'''</font></nowiki>
The dark-gray color (#666666) will appear as:
<font color="#666666">'''One''' and '''Two''' and '''Three'''</font>. Compare the bolded text of One/Two against the stark contrast of the shortcut title of this essay page: '''WP:TYPESET'''.


In Wikipedia, bold-faced text is used primarily to highlight words that are titles, or redirected terms, naming each article. Such bold-faced text can be confusing when not connected to the article title, so using a lighter bolded text allows for highlighting with less confusion about the article-title words.


Also, other lively colors could be bolded without much confusion with the article-title words (''see color choices in:'' [[Web colors]]).


==Auto-indenting of text==
Text can be auto-indented, depending on screen width, using ":" followed by "&amp;nbsp;" as follows:
<pre>
: &amp;nbsp; {{nowrap|"There is no substitute for knowledge". -Deming}}
</pre>
The indented line will indent less, on a narrow screen, because the non-breaking spaces will be on an upper line, and the text will shift to the next line. The result will appear as:
: &nbsp; &nbsp; {{nowrap|"There is no substitute for knowledge". -[[W. Edwards Deming|Deming]]}}


The reason for auto-indenting: sometimes text needs to be indented on wide screens, but less so on narrow windows, where it would not fit across the line if fully indented. This is typically the case for a long math equation or formula, such as the following:
<pre>
: &amp;nbsp; &amp;nbsp; <math>I_D= \mu_n C_{ox}\frac{W}{L} \left(
&nbsp; &nbsp; &nbsp; &nbsp; (V_{GS}-V_{th})V_{DS}-\frac{V_{DS}^2}{2} \right)</math>
</pre>
The generated formula will appear as:
:&nbsp; &nbsp; <math>I_D= \mu_n C_{ox}\frac{W}{L} \left( (V_{GS}-V_{th})V_{DS}-\frac{V_{DS}^2}{2} \right)</math>
On a very-narrow window, then the formula would auto-indent with less left-side spacing.


Such use of auto-indented text is, typically, rare, but can avoid large text-gaps where the page would become half-blank on a narrow window. Otherwise, many long equations (especially, near images or [[WP:infobox|infoboxes]]) would be shifted down a page, causing a wide text-gap to appear, because they would be too long to fit when fully indented.


==Changing font faces==
The [[MediaWiki]] markup language supports many HTML tags, including "&lt;font face=Garamond>" and such. Some of the fonts are:
* Garamond: <font face=Garamond>This is [[Garamond|Garamond font]].</font>
* Georgia: &nbsp; &nbsp; <font face=Georgia>This is [[Georgia font]].</font>
* Courier: &nbsp; &nbsp; <font face=Courier>This is [[Courier (typeface)|Courier font]].</font>
* Helvetica: &nbsp; <font face=Helvetica>This is [[Helvetica|Helvetica font]].</font>
* Times Roman: <font face=Times Roman>This is [[Times Roman font]].</font>
* Arial: &nbsp; &nbsp; &nbsp; <font face=Arial>This is [[Arial font]] (default).</font>


==Using commas & repetition for clarity==
Adding commas, and repeating key-phrases, can clarify many long, complex sentences. Consider the following text:
: In Japan along dark side streets small vending machines are used to sell food and drinks where in many American cities such machines would be enclosed in steel-reinforced cages.
The above text tends to run together, seeming too long for a proper sentence. However, consider adding some commas and also repeating some words:
: In Japan, along dark side-streets, small vending machines are used to sell food and drinks, whereas in many American cities, such machines would be enclosed in steel-reinforced cages, if on dark side-streets.
The commas clearly separate the long sentence into specific phrases. The subject of "vending machines on dark side-streets" is clarified, at the end, by repeating "on dark side-streets", which readers might have forgotten, since that was the second phrase in the long sentence. By adding commas and repeating key-phrases, many long sentences can be clarified without extensive re-writing; the commas act almost like magic to simplify a long sentence without drastic cutting or rewording. The term "whereas" was used to be more specific than the word "where" (other precise terms that help clarify include ''"instead"'' or ''"rather than"''). Often it is not necessary to re-write technical articles for better clarity; instead, just add several commas and see whether an article can be clarified within minutes, rather than spending hours re-writing, or splitting, the technical descriptions.


==Undenting/bracketing of text==
Real typesetting software, for over 30 years, typically has had simple directives to trigger alignments as left, right, center, or ''undented'' (beyond the left-margin line). However, for decades, HTML has had only limited options for easy alignment (one: &lt;center>). A method for undenting the first word of a paragraph is to put the paragraph into a text-table, where the first word (or syllable) is (alone) in column 1, while the other text is in column 2. For example, undenting "Beethoven":
: Wikicode: <nowiki> :::<table cellspacing=0 cellpadding=0><tr><td valign=top>Bee<td>thoven</nowiki><br>{{in5|16}}<nowiki>composed [[Moonlight Sonata]]<br>while he was losing his hearing.</table></nowiki>
: Results:
:::<table cellspacing=0 cellpadding=0><tr><td valign=top>Bee<td>thoven composed [[Moonlight Sonata]]<br>while he was losing his hearing.</table>


Note the use of both "cellspacing=0 cellpadding=0" so that no extra spacing separates the first syllable "Bee" from "thoven".


A third column can be used to enclose text in outside brackets, with the closing bracket "]" placed in column 3, as follows:


: Wikicode: <nowiki> :::<table><tr><td valign=top>[<td>This is line 1.<br>Line 2.<td>]</table></nowiki>
: Results:
::: <table><tr><td valign=top>[<td>This is line 1.<br>Line 2.<td valign=bottom>]</table>


Again, the designers of the [[HTML]] language had only limited knowledge of typesetting (hence the invention of font size=1 to 5!), so the only alignment directive was "&lt;center>". However, the currently invalid options "&lt;left>" and "&lt;right>" could be added someday, as a trivial implementation, because to handle "center" the left and right margins must already be known to the computer. The pitfalls of HTML, developed with little knowledge of typesetting (or even of computer languages), are typical when amateurs (or [[college dropout]]s) try to create a new technology. There are some examples of the opposite effect, such as an opera singer ([[Luciano Pavarotti]]) learning to sing well without being able to read music, but in general most attempts by hacks are botched failures. It is important to seek the knowledge of experts, yet hacks might not even understand the basics that experts know, so some humility is needed in such discussions.


==Related pages==
{|
|
* [[WP:Advanced article editing]]
* [[WP:Advanced footnote formatting]]
* [[WP:Advanced table formatting]]
* [[WP:Alert]]
* [[WP:Thinking outside the infobox]]
|width="30px"|&nbsp;<!--spacer-->
| valign=top |
* [[WP:Pruning article revisions]]
* [[WP:Overlink crisis]]
* [[WP:Authors of Wikipedia]]
* [[WP:Avoiding difficult users]]
|}
==Notes==
<div style="font-size:87%; line-height: 1.33em;">
:<sup>[a]</sup> - Only part of the lyrics are listed, to keep them short, per copyright laws limiting display to 10%, or prohibiting the display of an entire performable unit, of a composition.


</div>
::::[ ''This essay is a draft to be expanded, later...'' ]


[[Category:Wikipedia essays|Advanced text formatting]]

Dot product

In mathematics, the dot product is an operation that takes two vectors as input, and that returns a scalar number as output. The number returned is dependent on the length of both vectors, and on the angle between them. The name is derived from the centered dot "·" that is often used to designate this operation; the alternative name scalar product emphasizes the scalar (rather than vector) nature of the result.

The dot product contrasts (in three dimensional space) with the cross product, which produces a vector as result.

Definition

The dot product of two vectors a = [a1, a2, ..., an] and b = [b1, b2, ..., bn] is defined as:

[math]\displaystyle{ \mathbf{a}\cdot \mathbf{b} = \sum_{i=1}^n a_ib_i = a_1b_1 + a_2b_2 + \cdots + a_nb_n }[/math]

where Σ denotes summation (the sum of all the terms) and n is the dimension of the vector space.

In dimension 2, the dot product of vectors [a,b] and [c,d] is ac + bd. In the same way, in dimension 3, the dot product of vectors [a,b,c] and [d,e,f] is ad + be + cf. For example, the dot product of the two three-dimensional vectors [1, 3, −5] and [4, −2, −1] is

[math]\displaystyle{ [1, 3, -5] \cdot [4, -2, -1] = (1 \times 4) + (3 \times (-2)) + ((-5) \times (-1)) = (4) - (6) + (5) = 3. }[/math]
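
As a quick check of this arithmetic, here is a minimal Python sketch (plain lists, no libraries; the helper name dot is illustrative) that computes the same component-wise sum:
<pre>
# Component-wise dot product of two vectors of the same dimension.
def dot(a, b):
    assert len(a) == len(b), "vectors must have the same dimension"
    return sum(x * y for x, y in zip(a, b))

print(dot([1, 3, -5], [4, -2, -1]))  # 3, i.e. 4 - 6 + 5
</pre>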

Geometric interpretation

[Figure Dot Product.svg: A • B = |A| |B| cos(θ); |A| cos(θ) is the scalar projection of A onto B.]

In Euclidean geometry, the dot product, length, and angle are related. For a vector a, the dot product a · a is the square of the length of a, or

[math]\displaystyle{ {\mathbf{a} \cdot \mathbf{a}}=\left\|\mathbf{a}\right\|^2 }[/math]

where ||a|| denotes the length (magnitude) of a. More generally, if b is another vector

[math]\displaystyle{ \mathbf{a} \cdot \mathbf{b}=\left\|\mathbf{a}\right\| \, \left\|\mathbf{b}\right\| \cos \theta \, }[/math]

where ||a|| and ||b|| denote the length of a and b and θ is the angle between them.

This formula can be rearranged to determine the size of the angle between two nonzero vectors:

[math]\displaystyle{ \theta=\arccos \left( \frac {\bold{a}\cdot\bold{b}} {\left\|\bold{a}\right\|\left\|\bold{b}\right\|}\right) }[/math]

One can also first convert the vectors to unit vectors by dividing by their magnitude:

[math]\displaystyle{ \boldsymbol{\hat{a}} = \frac{\bold{a}}{\left\|\bold{a}\right\|} }[/math]

then the angle θ is given by

[math]\displaystyle{ \theta = \arccos ( \boldsymbol{\hat a}\cdot\boldsymbol{\hat b}) }[/math]

As the cosine of 90° is zero, the dot product of two orthogonal (perpendicular) vectors is always zero. Moreover, two vectors can be considered orthogonal if and only if their dot product is zero and they both have nonzero length. This property provides a simple test for orthogonality.
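
To make the angle formula and the orthogonality test concrete, the following Python sketch (standard math module only; the helper names dot, norm and angle_between are illustrative) computes θ = arccos(a • b / (||a|| ||b||)) for a few example vectors:
<pre>
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))  # ||a|| = sqrt(a . a)

def angle_between(a, b):
    # theta = arccos( (a . b) / (||a|| ||b||) ), for nonzero vectors a and b
    return math.acos(dot(a, b) / (norm(a) * norm(b)))

print(math.degrees(angle_between([1, 0, 0], [0, 1, 0])))  # ~90.0: dot product is 0, so orthogonal
print(math.degrees(angle_between([1, 1, 0], [1, 0, 0])))  # ~45.0
</pre>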

Sometimes these properties are also used for defining the dot product, especially in 2 and 3 dimensions; this definition is equivalent to the above one. For higher dimensions the formula can be used to define the concept of angle.

The geometric properties rely on the basis being orthonormal, i.e. composed of pairwise perpendicular vectors with unit length.

Scalar projection

If both a and b have length one (i.e., they are unit vectors), their dot product simply gives the cosine of the angle between them.

If only b is a unit vector, then the dot product a · b gives |a| cos(θ), i.e., the magnitude of the projection of a in the direction of b, with a minus sign if the direction is opposite. This is called the scalar projection of a onto b, or scalar component of a in the direction of b (see figure). This property of the dot product has several useful applications (for instance, see next section).

If neither a nor b is a unit vector, then the magnitude of the projection of a in the direction of b, for example, would be a · (b / |b|) as the unit vector in the direction of b is b / |b|.
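
A short Python sketch of the scalar projection a • (b / ||b||), with illustrative vectors; the helper name scalar_projection is not from any established API:
<pre>
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scalar_projection(a, b):
    # |a| cos(theta) = a . (b / ||b||), the scalar component of a along b
    norm_b = math.sqrt(dot(b, b))
    return dot(a, b) / norm_b

print(scalar_projection([3, 4], [1, 0]))   # 3.0: the component of [3, 4] along the x-axis
print(scalar_projection([3, 4], [-1, 0]))  # -3.0: negative when the direction is opposite
</pre>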

Rotation

A rotation of the orthonormal basis in terms of which vector a is represented is obtained with a multiplication of a by a rotation matrix R. This matrix multiplication is just a compact representation of a sequence of dot products.

For instance, let

  • B1 = {x, y, z} and B2 = {u, v, w} be two different orthonormal bases of the same space R3, with B2 obtained by just rotating B1,
  • a1 = (ax, ay, az) represent vector a in terms of B1,
  • a2 = (au, av, aw) represent the same vector in terms of the rotated basis B2,
  • u1, v1, w1 be the rotated basis vectors u, v, w represented in terms of B1.

Then the rotation from B1 to B2 is performed as follows:

[math]\displaystyle{ \bold a_2 = \bold{Ra}_1 = \begin{bmatrix} u_x & u_y & u_z \\ v_x & v_y & v_z \\ w_x & w_y & w_z \end{bmatrix} \begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} = \begin{bmatrix} \bold u_1\cdot\bold a_1 \\ \bold v_1\cdot\bold a_1 \\ \bold w_1\cdot\bold a_1 \end{bmatrix} = \begin{bmatrix} a_u \\ a_v \\ a_w \end{bmatrix} . }[/math]

Notice that the rotation matrix R is assembled by using the rotated basis vectors u1, v1, w1 as its rows, and these vectors are unit vectors. By definition, Ra1 consists of a sequence of dot products between each of the three rows of R and vector a1. Each of these dot products determines a scalar component of a in the direction of a rotated basis vector (see previous section).

If a1 is a row vector, rather than a column vector, then R must contain the rotated basis vectors in its columns, and must post-multiply a1:

[math]\displaystyle{ \bold a_2 = \bold a_1 \bold R = \begin{bmatrix} a_x & a_y & a_z \end{bmatrix} \begin{bmatrix} u_x & v_x & w_x \\ u_y & v_y & w_y \\ u_z & v_z & w_z \end{bmatrix} = \begin{bmatrix} \bold u_1\cdot\bold a_1 & \bold v_1\cdot\bold a_1 & \bold w_1\cdot\bold a_1 \end{bmatrix} = \begin{bmatrix} a_u & a_v & a_w \end{bmatrix} . }[/math]
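
The following NumPy sketch illustrates the point that multiplying by R is a sequence of dot products; the particular basis (a 90° rotation about the z-axis) and the vector a are chosen only as examples:
<pre>
import numpy as np

# Rotated basis vectors u1, v1, w1 expressed in the old basis B1:
# here a 90-degree rotation about the z-axis, chosen purely as an example.
u1 = np.array([0.0, 1.0, 0.0])
v1 = np.array([-1.0, 0.0, 0.0])
w1 = np.array([0.0, 0.0, 1.0])

R = np.vstack([u1, v1, w1])       # rotated basis vectors as the rows of R
a1 = np.array([2.0, 3.0, 5.0])    # vector a expressed in B1

a2 = R @ a1                       # each component is a dot product: u1.a1, v1.a1, w1.a1
print(a2)                                     # [ 3. -2.  5.]
print(np.array([u1 @ a1, v1 @ a1, w1 @ a1]))  # the same three dot products
print(a1 @ R.T)                               # row-vector form: R.T holds u1, v1, w1 as columns
</pre>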

Physics

In physics, magnitude is a scalar in the physical sense, i.e. a physical quantity independent of the coordinate system, expressed as the product of a numerical value and a physical unit, not just a number. The dot product is also a scalar in this sense, given by the formula, independent of the coordinate system. Examples:

  • Mechanical work is the dot product of the force and displacement vectors.
  • Magnetic flux is the dot product of the magnetic field and the area vectors.
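
As a small illustration of the first example, mechanical work is just the dot product of the force and displacement vectors; the numbers below are illustrative:
<pre>
# W = F . d: only the force component along the displacement does work.
force = [10.0, 0.0, 5.0]         # newtons (illustrative values)
displacement = [3.0, 0.0, 0.0]   # metres

work = sum(f * s for f, s in zip(force, displacement))
print(work)  # 30.0 joules: the component perpendicular to the motion contributes nothing
</pre>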

Properties

The following properties hold if a, b, and c are real vectors and r is a scalar.

The dot product is commutative:

[math]\displaystyle{ \mathbf{a} \cdot \mathbf{b} = \mathbf{b} \cdot \mathbf{a}. }[/math]

The dot product is distributive over vector addition:

[math]\displaystyle{ \mathbf{a} \cdot (\mathbf{b} + \mathbf{c}) = \mathbf{a} \cdot \mathbf{b} + \mathbf{a} \cdot \mathbf{c}. }[/math]

The dot product is bilinear:

[math]\displaystyle{ \mathbf{a} \cdot (r\mathbf{b} + \mathbf{c}) = r(\mathbf{a} \cdot \mathbf{b}) +(\mathbf{a} \cdot \mathbf{c}). }[/math]

When the vectors are multiplied by scalar values, the dot product satisfies:

[math]\displaystyle{ (c_1\mathbf{a}) \cdot (c_2\mathbf{b}) = (c_1c_2) (\mathbf{a} \cdot \mathbf{b}) }[/math]

(these last two properties follow from the first two).

Two non-zero vectors a and b are perpendicular if and only if a • b = 0.

Unlike multiplication of ordinary numbers, where if ab = ac, then b always equals c unless a is zero, the dot product does not obey the cancellation law:

If ab = ac and a0, then we can write: a • (bc) = 0 by the distributive law; the result above says this just means that a is perpendicular to (bc), which still allows (bc) ≠ 0, and therefore bc.

Provided that the basis is orthonormal, the dot product is invariant under isometric changes of the basis: rotations, reflections, and combinations, keeping the origin fixed. The above mentioned geometric interpretation relies on this property. In other words, for an orthonormal space with any number of dimensions, the dot product is invariant under a coordinate transformation based on an orthogonal matrix. This corresponds to the following two conditions:

  • The new basis is again orthonormal (i.e., it is orthonormal expressed in the old one).
  • The new base vectors have the same length as the old ones (i.e., unit length in terms of the old basis).

If a and b are functions, then the derivative of a • b is a′ • b + a • b′.

Triple product expansion

Main article: Triple product

This is a very useful identity (also known as Lagrange's formula) involving the dot- and cross-products. It is written as

[math]\displaystyle{ \mathbf{a} \times (\mathbf{b} \times \mathbf{c}) = \mathbf{b}(\mathbf{a}\cdot\mathbf{c}) - \mathbf{c}(\mathbf{a}\cdot\mathbf{b}) }[/math]

which is easier to remember as "BAC minus CAB", keeping in mind which vectors are dotted together. This formula is commonly used to simplify vector calculations in physics.

Proof of the geometric interpretation

Consider the element of Rn

[math]\displaystyle{ \mathbf{v} = v_1 \mathbf{\hat{e}}_1 + v_2 \mathbf{\hat{e}}_2 + ... + v_n \mathbf{\hat{e}}_n. \, }[/math]

Repeated application of the Pythagorean theorem yields for its length |v|

[math]\displaystyle{ |\mathbf{v}|^2 = v_1^2 + v_2^2 + ... + v_n^2. \, }[/math]

But this is the same as

[math]\displaystyle{ \mathbf{v} \cdot \mathbf{v} = v_1^2 + v_2^2 + ... + v_n^2, \, }[/math]

so we conclude that taking the dot product of a vector v with itself yields the squared length of the vector.

Lemma 1
[math]\displaystyle{ \mathbf{v} \cdot \mathbf{v} = |\mathbf{v}|^2. \, }[/math]

Now consider two vectors a and b extending from the origin, separated by an angle θ. A third vector c may be defined as

[math]\displaystyle{ \mathbf{c} \ \stackrel{\mathrm{def}}{=}\ \mathbf{a} - \mathbf{b}. \, }[/math]

creating a triangle with sides a, b, and c. According to the law of cosines, we have

[math]\displaystyle{ |\mathbf{c}|^2 = |\mathbf{a}|^2 + |\mathbf{b}|^2 - 2 |\mathbf{a}||\mathbf{b}| \cos \theta. \, }[/math]

Substituting dot products for the squared lengths according to Lemma 1, we get

[math]\displaystyle{ \mathbf{c} \cdot \mathbf{c} = \mathbf{a} \cdot \mathbf{a} + \mathbf{b} \cdot \mathbf{b} - 2 |\mathbf{a}||\mathbf{b}| \cos\theta. \, }[/math]                   (1)

But as c ≡ a − b, we also have

[math]\displaystyle{ \mathbf{c} \cdot \mathbf{c} = (\mathbf{a} - \mathbf{b}) \cdot (\mathbf{a} - \mathbf{b}) \, }[/math],

which, according to the distributive law, expands to

[math]\displaystyle{ \mathbf{c} \cdot \mathbf{c} = \mathbf{a} \cdot \mathbf{a} + \mathbf{b} \cdot \mathbf{b} -2(\mathbf{a} \cdot \mathbf{b}). \, }[/math]                     (2)

Merging the two c • c equations, (1) and (2), we obtain

[math]\displaystyle{ \mathbf{a} \cdot \mathbf{a} + \mathbf{b} \cdot \mathbf{b} -2(\mathbf{a} \cdot \mathbf{b}) = \mathbf{a} \cdot \mathbf{a} + \mathbf{b} \cdot \mathbf{b} - 2 |\mathbf{a}||\mathbf{b}| \cos\theta. \, }[/math]

Subtracting a • a + b • b from both sides and dividing by −2 leaves

[math]\displaystyle{ \mathbf{a} \cdot \mathbf{b} = |\mathbf{a}||\mathbf{b}| \cos\theta. \, }[/math]

Q.E.D.
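
The identity can also be spot-checked numerically. The Python sketch below (random illustrative vectors) recovers cos θ from the side lengths via the law of cosines and confirms that a • b = ||a|| ||b|| cos θ to within floating-point error:
<pre>
import math
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

# Recover cos(theta) from the side lengths |a|, |b| and |c| = |a - b|
# via the law of cosines, then compare against the dot product directly.
random.seed(0)
a = [random.uniform(-1.0, 1.0) for _ in range(3)]
b = [random.uniform(-1.0, 1.0) for _ in range(3)]
c = [x - y for x, y in zip(a, b)]

cos_theta = (norm(a)**2 + norm(b)**2 - norm(c)**2) / (2.0 * norm(a) * norm(b))
print(abs(dot(a, b) - norm(a) * norm(b) * cos_theta) < 1e-12)  # True
</pre>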

Generalization

The inner product generalizes the dot product to abstract vector spaces and is usually denoted by [math]\displaystyle{ \langle\mathbf{a}\, , \mathbf{b}\rangle }[/math]. Due to the geometric interpretation of the dot product the norm ||a|| of a vector a in such an inner product space is defined as

[math]\displaystyle{ \|\mathbf{a}\| = \sqrt{\langle\mathbf{a}\, , \mathbf{a}\rangle} }[/math]

such that it generalizes length, and the angle θ between two vectors a and b by

[math]\displaystyle{ \cos{\theta} = \frac{\langle\mathbf{a}\, , \mathbf{b}\rangle}{\|\mathbf{a}\| \, \|\mathbf{b}\|}. }[/math]

In particular, two vectors are considered orthogonal if their inner product is zero

[math]\displaystyle{ \langle\mathbf{a}\, , \mathbf{b}\rangle = 0. }[/math]

For vectors with complex entries, using the given definition of the dot product would lead to quite different geometric properties. For instance the dot product of a vector with itself can be an arbitrary complex number, and can be zero without the vector being the zero vector; this in turn would have severe consequences for notions like length and angle. Many geometric properties can be salvaged, at the cost of giving up the symmetric and bilinear properties of the scalar product, by alternatively defining

[math]\displaystyle{ \mathbf{a}\cdot \mathbf{b} = \sum{a_i \overline{b_i}} }[/math]

where \overline{b_i} is the complex conjugate of b_i. Then the scalar product of any vector with itself is a non-negative real number, and it is nonzero except for the zero vector. However, this scalar product is not linear in b (but rather conjugate linear), and it is not symmetric either, since

[math]\displaystyle{ \mathbf{a} \cdot \mathbf{b} = \overline{\mathbf{b} \cdot \mathbf{a}} }[/math].

This type of scalar product is nevertheless quite useful, and leads to the notions of Hermitian form and of general inner product spaces.
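
A small Python sketch of this conjugate-linear scalar product (illustrative complex vectors; the helper name cdot is not standard), showing that a • a is real and non-negative and that the product is conjugate-symmetric rather than symmetric:
<pre>
# Scalar product with conjugation on the second argument, as defined above.
def cdot(a, b):
    return sum(x * y.conjugate() for x, y in zip(a, b))

a = [1 + 2j, 3 - 1j]
b = [2 - 1j, 1j]

print(cdot(a, a))                            # (15+0j): real and non-negative
print(cdot(a, b), cdot(b, a))                # (-1+2j) (-1-2j): not symmetric ...
print(cdot(a, b) == cdot(b, a).conjugate())  # True: ... but conjugate-symmetric
</pre>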

The Frobenius inner product generalizes the dot product to matrices. It is defined as the sum of the products of the corresponding components of two matrices having the same size.
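
A minimal NumPy sketch of the Frobenius inner product for two illustrative 2×2 matrices; the trace form shown in the last line is a standard equivalent way to compute the same sum:
<pre>
import numpy as np

# Frobenius inner product: sum of products of corresponding entries
# of two matrices of the same size (illustrative 2x2 matrices).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

print(np.sum(A * B))      # 70.0: elementwise multiply, then sum
print(np.trace(A.T @ B))  # 70.0: the equivalent trace formulation
</pre>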

Generalization to tensors

The dot product between a tensor of order n and a tensor of order m is a tensor of order n+m-2. The dot product is worked out by multiplying and summing across a single index in both tensors. If [math]\displaystyle{ \mathbf{A} }[/math] and [math]\displaystyle{ \mathbf{B} }[/math] are two tensors with element representation [math]\displaystyle{ A_{ij\dots}^{k\ell\dots} }[/math] and [math]\displaystyle{ B_{mn\dots}^{p{\dots}i} }[/math] the elements of the dot product [math]\displaystyle{ \mathbf{A} \cdot \mathbf{B} }[/math] are given by

[math]\displaystyle{ A_{ij\dots}^{k\ell\dots}B_{mn\dots}^{p{\dots}i} = \sum_{i=1}^n A_{ij\dots}^{k\ell\dots}B_{mn\dots}^{p{\dots}i} }[/math]

This definition naturally reduces to the standard vector dot product when applied to vectors, and matrix multiplication when applied to matrices.

Occasionally, a double dot product is used to represent multiplying and summing across two indices. The double dot product between two 2nd order tensors is a scalar.
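
The single and double contractions can be sketched with NumPy's tensordot for two illustrative second-order tensors; for matrices the single contraction reduces to matrix multiplication and the double dot product gives a scalar:
<pre>
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))
B = rng.random((3, 3))

# Single contraction (summing over one shared index): for second-order tensors
# this is ordinary matrix multiplication, a tensor of order 2 + 2 - 2 = 2.
single = np.tensordot(A, B, axes=1)
print(np.allclose(single, A @ B))         # True

# Double dot product (summing over two indices): a scalar for second-order tensors.
double = np.tensordot(A, B, axes=2)       # sum over A[i, j] * B[i, j]
print(np.isclose(double, np.sum(A * B)))  # True
</pre>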

Related pages

  • Cauchy–Schwarz inequality
  • Cross product
  • Matrix multiplication
  • Physics

Other websites

  • "Dot product" at Wolfram MathWorld
  • A quick geometrical derivation and interpretation of dot product: http://behindtheguesses.blogspot.com/2009/04/dot-and-cross-products.html
  • Interactive GeoGebra applet: http://xahlee.org/SpecialPlaneCurves_dir/ggb/Vector_Dot_Product.html
  • Java demonstration of dot product: http://www.falstad.com/dotproduct/
  • Another Java demonstration of dot product: http://www.cs.brown.edu/exploratories/freeSoftware/repository/edu/brown/cs/exploratories/applets/dotProduct/dot_product_guide.html
  • Explanation of dot product, including with complex vectors: http://www.mathreference.com/la,dot.html
  • "Dot Product" by Bruce Torrence, Wolfram Demonstrations Project, 2007: http://demonstrations.wolfram.com/DotProduct/
  • Intuitive explanation: video 1 (https://www.youtube.com/watch?v=_WgRwRyssk0) and video 2 (https://www.youtube.com/watch?v=YyNnK0T0w9o) from the online Interactive 3D graphics course (https://www.udacity.com/course/interactive-3d-graphics--cs291)