-
baseline_200702
Matrix transformation function; implements rotation of 3D points about an axis.
- Downloaded 2008-01-08 01:53:06
- Points: 1
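The MATLAB source itself is not shown here; as a sketch of the technique the entry names (rotation about an axis), the Rodrigues formula builds the rotation matrix from an axis and an angle:

```python
import numpy as np

def axis_rotation_matrix(axis, theta):
    """Rotation by angle theta (radians) about an arbitrary 3D axis,
    via the Rodrigues formula R = I + sin(t)*K + (1 - cos(t))*K^2,
    where K is the cross-product (skew-symmetric) matrix of the unit axis."""
    u = np.asarray(axis, dtype=float)
    u = u / np.linalg.norm(u)                 # normalize the axis
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# Rotating the x unit vector by 90 degrees about z gives the y unit vector
R = axis_rotation_matrix([0, 0, 1], np.pi / 2)
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))
```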
-
5dof-newmark-example
MATLAB solution of the equations of motion of a 5-degree-of-freedom dynamic system using the Newmark method.
- Downloaded 2014-04-18 18:07:26
- Points: 1
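Without the MATLAB file, a minimal Python sketch of the same Newmark-beta integrator for M u'' + C u' + K u = F(t) (average-acceleration variant, beta = 1/4, gamma = 1/2) looks like this; for a 5-DOF system you would pass 5x5 M, C, K matrices:

```python
import numpy as np

def newmark(M, C, K, F, dt, n_steps, beta=0.25, gamma=0.5):
    """Newmark-beta time integration of M u'' + C u' + K u = F(t).
    The default beta=1/4, gamma=1/2 (average acceleration) scheme
    is unconditionally stable for linear systems."""
    ndof = M.shape[0]
    u = np.zeros((n_steps + 1, ndof))   # displacement history
    v = np.zeros((n_steps + 1, ndof))   # velocity history
    a = np.zeros((n_steps + 1, ndof))   # acceleration history
    a[0] = np.linalg.solve(M, F(0.0) - C @ v[0] - K @ u[0])
    # Effective stiffness is constant for a linear system
    Keff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    for i in range(n_steps):
        t = (i + 1) * dt
        rhs = (F(t)
               + M @ (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                      + (1 / (2 * beta) - 1) * a[i])
               + C @ (gamma / (beta * dt) * u[i]
                      + (gamma / beta - 1) * v[i]
                      + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = np.linalg.solve(Keff, rhs)
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a
```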
-
DMC1
DMC code snippet implementing dynamic matrix control (DMC) of a liquid level.
- Downloaded 2013-03-18 15:32:32
- Points: 1
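As the snippet itself is not included, here is a minimal sketch of unconstrained DMC on a hypothetical first-order level plant (the plant model, horizons, and weighting are illustrative assumptions, not taken from the original):

```python
import numpy as np

def dmc_gain(step, P, Mc, lam=0.01):
    """Unconstrained DMC controller gain. `step` holds the plant's unit
    step-response coefficients s_1..s_P; A is the P x Mc dynamic matrix
    and the control law is du = (A'A + lam*I)^-1 A' e."""
    A = np.zeros((P, Mc))
    for j in range(Mc):
        A[j:, j] = step[:P - j]
    return np.linalg.solve(A.T @ A + lam * np.eye(Mc), A.T)

# Hypothetical first-order level plant y[k+1] = 0.9 y[k] + 0.1 u[k];
# its unit step-response coefficients are s_i = 1 - 0.9**i.
a, b = 0.9, 0.1
P, Mc = 10, 2
s = 1 - a ** np.arange(1, P + 1)
k_row = dmc_gain(s, P, Mc)[0]        # only the first move is applied

r, y, u = 1.0, 0.0, 0.0              # setpoint, output, input
for _ in range(300):
    # free response: predicted output over the horizon if u were held
    y_free = a ** np.arange(1, P + 1) * y + u * s
    u += k_row @ (r - y_free)        # receding-horizon control move
    y = a * y + b * u                # plant update
```

Because the control law acts on input increments and the step-response gain matches the plant, the loop settles at the setpoint without offset.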
-
K-PNN
K-PNN Algorithm
A variant of the k-nearest-neighbor (k-NN) algorithm.
- Downloaded 2010-02-04 16:35:14
- Points: 1
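The K-PNN variant itself is not documented here; a plain k-nearest-neighbor classifier, the base algorithm the entry refers to, can be sketched as:

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest
    training points under Euclidean distance."""
    d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
    nearest = np.argsort(d)[:k]               # indices of the k closest
    return Counter(y_train[nearest]).most_common(1)[0][0]
```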
-
16PSK-BER-using-Gray-Mapping
MATLAB source code for the bit error rate (BER) of 16-PSK with Gray mapping.
- Downloaded 2012-06-08 05:00:04
- Points: 1
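The MATLAB code is not reproduced here; a Python sketch of the same experiment, a Monte-Carlo BER simulation of Gray-mapped 16-PSK over AWGN, could look like this:

```python
import numpy as np

def psk16_ber(ebn0_db, n_sym=200_000, seed=1):
    """Monte-Carlo bit error rate of 16-PSK over AWGN with Gray
    mapping: adjacent constellation points differ in exactly one
    bit, so the dominant nearest-neighbor symbol errors cost only
    one bit error each."""
    rng = np.random.default_rng(seed)
    M, k = 16, 4                                  # constellation size, bits/symbol
    gray = np.arange(M) ^ (np.arange(M) >> 1)     # binary index -> Gray label
    popcnt = np.array([bin(i).count("1") for i in range(M)])
    sym = rng.integers(0, M, n_sym)               # transmitted symbol indices
    tx = np.exp(2j * np.pi * sym / M)             # unit-energy 16-PSK points
    es_n0 = k * 10 ** (ebn0_db / 10)              # Es/N0 from Eb/N0
    sigma = np.sqrt(1 / (2 * es_n0))              # per-dimension noise std (Es = 1)
    rx = tx + sigma * (rng.standard_normal(n_sym)
                       + 1j * rng.standard_normal(n_sym))
    det = np.round(np.angle(rx) * M / (2 * np.pi)).astype(int) % M  # nearest phase
    bit_errors = popcnt[gray[sym] ^ gray[det]].sum()
    return bit_errors / (k * n_sym)
```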
-
weifen-chafen
Code for a difference/differential model; easy to use and quick to learn.
- Downloaded 2013-08-14 20:41:49
- Points: 1
-
multiple-acess-tech-in-matlab-CDMA
Multiple-access techniques (CDMA) coded in MATLAB.
- Downloaded 2013-03-06 17:21:41
- Points: 1
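The MATLAB implementation is not shown; the core CDMA idea, spreading each user's bits with an orthogonal code so all users can share one channel, can be sketched in Python with Walsh-Hadamard codes:

```python
import numpy as np

def hadamard(n):
    """Walsh-Hadamard matrix of order n (n a power of two);
    its rows are mutually orthogonal spreading codes."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def cdma_roundtrip(bits, codes):
    """Synchronous CDMA: each user's +/-1 bits are spread by its own
    code, all users are summed on the channel, and each receiver
    despreads by correlating the received chips with its code."""
    chips = sum(np.outer(b, c).ravel() for b, c in zip(bits, codes))
    rx = chips.reshape(-1, codes.shape[1])        # one row per bit interval
    return [np.sign(rx @ c) for c in codes]       # correlate and decide
```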
-
BP_for_4
Uses a neural network to approximate the function sin(x1)/x1 * sin(x2)/x2. MATLAB code; the input is a two-dimensional array and the output is a scalar, with a resulting error of about 0.225.
- Downloaded 2014-11-19 13:06:30
- Points: 1
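Since the MATLAB code is not included, here is a minimal Python sketch of the same task: a one-hidden-layer network trained by backpropagation to fit sin(x1)/x1 * sin(x2)/x2 (the layer width, learning rate, and sampling range are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Training set for the target f(x1, x2) = sin(x1)/x1 * sin(x2)/x2
# (np.sinc(x/pi) equals sin(x)/x for the unnormalized sinc)
X = rng.uniform(-8, 8, size=(500, 2))
y = (np.sinc(X[:, 0] / np.pi) * np.sinc(X[:, 1] / np.pi)).reshape(-1, 1)

# One tanh hidden layer, linear output, plain batch gradient descent
# on the mean-squared error (i.e. backpropagation)
n_h = 20
W1 = rng.normal(0, 0.5, (2, n_h)); b1 = np.zeros(n_h)
W2 = rng.normal(0, 0.5, (n_h, 1)); b2 = np.zeros(1)
lr, losses = 0.05, []
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                      # forward pass
    out = H @ W2 + b2
    err = out - y
    losses.append(float((err ** 2).mean()))
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)   # output-layer gradients
    dH = (err @ W2.T) * (1 - H ** 2)              # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```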
-
map-with--adj-matrix
MATLAB source code for drawing a graph from its adjacency matrix.
- Downloaded 2014-02-09 09:11:51
- Points: 1
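The MATLAB source (presumably built around something like gplot) is not shown; the underlying step, turning an adjacency matrix into node coordinates and an edge list ready for plotting, can be sketched as:

```python
import numpy as np

def graph_layout(adj):
    """Place the n nodes of an undirected graph on a unit circle and
    list the edges found in the upper triangle of the adjacency matrix;
    the (x, y) pairs and edge list are what a plot routine needs."""
    n = len(adj)
    ang = 2 * np.pi * np.arange(n) / n
    xy = np.column_stack([np.cos(ang), np.sin(ang)])   # circular layout
    edges = [(i, j) for i in range(n)
             for j in range(i + 1, n) if adj[i][j]]
    return xy, edges

A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
xy, edges = graph_layout(A)
print(edges)   # [(0, 1), (0, 2), (1, 2), (2, 3)]
```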
-
FR
Description: The conjugate gradient method sits between the steepest-descent method and Newton's method. It needs only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. The conjugate gradient method is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is especially important: it requires little storage, converges in finitely many steps on quadratic problems, is highly stable, and needs no externally tuned parameters.
- Downloaded 2017-05-05 10:26:25
- Points: 1
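The entry name FR suggests the Fletcher-Reeves variant; without the original file, a Python sketch of Fletcher-Reeves nonlinear conjugate gradient with a simple backtracking line search (the line-search constants and restart rule are illustrative choices) is:

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=500):
    """Fletcher-Reeves nonlinear conjugate gradient: uses only first
    derivatives; each search direction is the new steepest-descent
    direction plus beta = ||g_new||^2 / ||g||^2 times the old one."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking (Armijo) line search along d
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves beta
        d = -g_new + beta * d
        if g_new @ d >= 0:                   # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a quadratic f(x) = x'Ax/2 - b'x the gradient is Ax - b, so the minimizer is the solution of Ax = b, which makes a convenient check.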