
HashingToolbox

Published 2015-02-18 · File size: 797 KB
Download points: 1 · Downloads: 1

Code description:

  A set of Matlab experiments that illustrates advanced computational signal and image processing.



  • SB1_110
    Implements relevance vector machine (RVM) regression and classification, using classic examples.
    2012-04-27 20:27:23 Download
    Points: 1
  • Fuzzy-Logic-and-Backstepping
    Nonlinear control design of a grid-connected photovoltaic inverter using fuzzy control and backstepping; well explained.
    2013-12-03 23:21:13 Download
    Points: 1
  • boqingwen
    MATLAB program for online least-squares system identification of a second-order RC equivalent-circuit battery model.
    2020-09-12 16:28:01 Download
    Points: 1
  • RBNN
    Graph-theory-based algorithm that forms clusters within a radius r around each point.
    2012-03-31 07:14:51 Download
    Points: 1
  • elearn
    Develops a sample website for e-learning.
    2014-09-13 18:23:17 Download
    Points: 1
  • svm
    Human silhouette detection based on SVM, using the omega (head-shoulder) shape feature.
    2015-03-28 09:31:07 Download
    Points: 1
  • sofmFaultDiagnosis
    Complete, runnable example of fault diagnosis with a SOFM (self-organizing feature map) neural network.
    2015-01-13 15:01:16 Download
    Points: 1
  • Programming-syntax-ofmatlab
    A good PPT document on MATLAB programming syntax.
    2013-05-03 16:56:35 Download
    Points: 1
  • K-means
    Image segmentation method based on the K-means clustering algorithm; very helpful for learning and understanding K-means clustering.
    2013-05-26 20:54:25 Download
    Points: 1
  • BFGS
    Like the steepest descent method, quasi-Newton methods require only the gradient of the objective function at each iteration. By measuring changes in the gradient, they construct a model of the objective function good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. Moreover, because quasi-Newton methods do not need second-derivative information, they are sometimes more efficient than Newton's method. Today, optimization software includes many quasi-Newton algorithms for unconstrained, constrained, and large-scale optimization problems.
    2017-05-05 10:28:29 Download
    Points: 1
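The BFGS entry above describes the quasi-Newton idea: use only gradients, and build an inverse-Hessian approximation from gradient differences. The listed package is MATLAB code that is not shown here; as an illustration only, the following is a minimal sketch of the BFGS update in Python/NumPy with a simple Armijo backtracking line search (the function names and test problem are this sketch's own choices, not taken from the package).

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimal BFGS sketch: only gradients are required; the inverse-Hessian
    approximation H is updated from gradient differences."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t = 1.0                         # backtracking line search (Armijo condition)
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p                       # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                   # change in gradient
        sy = s @ y
        if sy > 1e-12:                  # curvature condition: keep H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_min = bfgs(f, grad, np.array([-1.2, 1.0]))
```

The curvature check `sy > 1e-12` skips the update when the step would destroy positive definiteness, which keeps the search direction a descent direction; production codes use a Wolfe line search instead, which guarantees the curvature condition holds.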