descent n. 1. descending, going down. 2. downward slope, incline. 3. lineage, family background, ancestry. 4. (Law) inheritance, succession. 5. a generation; (archaic) descendants, offspring. 6. sudden attack, raid. 7. condescension, lowering oneself. be of French descent: to be of French ancestry. be of good descent: to be of good birth. descent of man: the descent (origin) of man. make a descent upon: to raid, to invade.
The rule is based on the idea of gradient descent.
Based on the gradient descent rule, the BP (back propagation) algorithm is a local optimization algorithm.
A comparative study of some typical improved BP network models based on gradient descent and numerical optimization is presented; global optimization strategies are considered from two aspects, optimization of the network model and optimization of the network algorithm.
The network weights are trained with the gradient descent method, and the growth algorithm for BVS and the limited-memory recursive formula for network training are derived.
The essence of back propagation networks is to use the gradient descent method so that the weight changes always move in the direction of decreasing error, finally reaching the minimum error.
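As a minimal numerical sketch of this weight-update idea for a single linear neuron (the data, learning rate, and squared-error loss below are illustrative assumptions, not taken from the cited work):

```python
import numpy as np

# Gradient descent on a single linear neuron with squared error.
# Inputs, targets and learning rate are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))           # 20 samples, 3 inputs
true_w = np.array([1.5, -2.0, 0.5])
t = X @ true_w                          # target outputs

w = np.zeros(3)                         # weights to be trained
lr = 0.05                               # learning rate

for _ in range(200):
    y = X @ w                           # neuron output
    grad = X.T @ (y - t) / len(X)       # gradient of the mean squared error
    w -= lr * grad                      # change weights in the direction of decreasing error

print(w)                                # approaches true_w as the error nears its minimum
```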
The parameters of the local model can either be adjusted by gradient descent within the neighborhood together with the SOFM weights, or estimated optimally by least-squares estimation (LSE).
Liu L K, Feig E. A block-based gradient descent search algorithm for block motion estimation in video coding. Video Technol., Aug. 1994, 4: 438-442. These algorithms are suitable when motion is relatively smooth, but when motion is more intense they are likely to become trapped in a local optimum, causing a loss of search accuracy.
The back-propagation algorithm also rests on the idea of gradient descent, so the only change in the analysis of weight modification concerns the difference between t(p, n) and y(p, n).
A gradient descent learning algorithm is employed to train the network. Finally, the whole thesis is summarized, the problems it solves are reviewed, and directions for further research on soft-sensing techniques based on multivariate statistical projection are highlighted.
In this scheme, the inputs of the hidden-layer neurons are obtained by the gradient descent method, and the weights and threshold of each neuron are trained by the linear least-squares method.
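One possible reading of such a hybrid scheme, sketched under assumptions (a tanh hidden layer, squared-error loss, and synthetic data; the scheme in the cited work may differ in detail): the hidden-layer weights take gradient descent steps while the output weights and threshold are re-solved by linear least squares at each iteration.

```python
import numpy as np

# Hybrid training sketch: hidden weights by gradient descent,
# output weights and threshold by linear least squares.
# Architecture, data and learning rate are illustrative assumptions.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
t = np.sin(X[:, 0]) + 0.5 * X[:, 1]                   # synthetic targets

W_h = rng.normal(scale=0.5, size=(2, 8))              # hidden-layer weights
lr = 0.01

for _ in range(50):
    H = np.tanh(X @ W_h)                              # hidden activations
    H1 = np.hstack([H, np.ones((len(X), 1))])         # append bias column
    w_out, *_ = np.linalg.lstsq(H1, t, rcond=None)    # least-squares output weights/threshold
    y = H1 @ w_out
    err = y - t
    d_pre = np.outer(err, w_out[:-1]) * (1 - H ** 2)  # backprop through tanh
    W_h -= lr * X.T @ d_pre / len(X)                  # gradient descent step on hidden weights

print(np.mean((y - t) ** 2))                          # mean squared error after training
```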
Gradient descent is a first-order optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or of the approximate gradient) of the function at the current point.
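A minimal sketch of this procedure on a one-dimensional quadratic (the objective, step size, and starting point below are illustrative assumptions):

```python
# Gradient descent on f(x) = (x - 3)^2; the minimum is at x = 3.
def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)           # derivative of f

x = 0.0                              # starting point
step = 0.1                           # step size (proportionality constant)

for _ in range(100):
    x -= step * grad_f(x)            # step proportional to the negative gradient

print(x, f(x))                       # x approaches 3, f(x) approaches 0
```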