
kxakpcmq

Published 2016-03-15 · File size: 6 KB
Download points: 1 · Downloads: 5

Code description:

  BP neural network for function fitting and pattern recognition. Includes common array signal processing algorithms and a rich set of parameter options; debugged under MATLAB R2009b. Can be widely applied to data prediction and data analysis, as well as FMCW (frequency-modulated continuous-wave) radar range and angle measurement.
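As a rough illustration of the BP (backpropagation) function-fitting idea described above, here is a minimal sketch in Python/NumPy rather than MATLAB; the network size, learning rate, and target function are all invented for this example and are not taken from kxakpcmq.m:

```python
import numpy as np

# Minimal 1-hidden-layer backpropagation ("BP") network fitting y = sin(x).
# All names and sizes here are illustrative, not from the uploaded code.
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

H = 16                                    # hidden units
W1 = rng.normal(0.0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    g_pred = 2.0 * (pred - y) / len(X)    # d(MSE)/d(pred)
    # backward pass (chain rule through each layer)
    gW2 = h.T @ g_pred;  gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h**2)  # tanh' = 1 - tanh^2
    gW1 = X.T @ g_h;     gb1 = g_h.sum(0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= lr * g                       # gradient-descent update

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

In MATLAB the same idea would normally be expressed with the Neural Network Toolbox (`feedforwardnet`/`train`); the manual loop above only shows what backpropagation computes.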

File list:

kxakpcmq.m, 9533 bytes, 2016-03-15



  • WiMAX0FDMAMatlab
    OFDMA simulation (other)
    2010-08-17 10:18:23 download
    Points: 1
  • codegen
    A self-written MATLAB program for generating C/A codes, shared here in the hope it is useful to everyone.
    2012-09-03 13:51:47 download
    Points: 1
  • ekfukf
    Extended Kalman filter and unscented Kalman filter toolbox
    2013-11-02 15:55:14 download
    Points: 1
  • tgff-3.5
    Task graph generator: Task Graphs For Free (TGFF), version 3.5
    2020-12-21 09:29:08 download
    Points: 1
  • goGPS_v0.4.3
    Open-source GPS MATLAB program, including least squares, Kalman filtering, the LAMBDA algorithm, ionospheric correction, tropospheric correction, satellite clock correction, etc. A good reference for GNSS study.
    2015-09-05 15:56:57 download
    Points: 1
  • root_L
    Draws a pole-zero plot without using zplane: finds the zeros and poles with roots and plots them with plot.
    2014-11-25 00:54:33 download
    Points: 1
  • kmeans
    kmeans is an image segmentation algorithm.
    2014-02-06 15:27:22 download
    Points: 1
  • 最速下降法
    Description: The steepest descent method is an iterative method that can be used to solve least-squares problems (both linear and nonlinear). When solving for the model parameters of a machine-learning algorithm, i.e. an unconstrained optimization problem, gradient descent is one of the most commonly used methods; another common method is least squares. To find the minimum of a loss function, gradient descent iterates step by step toward the minimizing loss value and model parameters. Conversely, to find the maximum of a loss function, gradient ascent is used instead. In machine learning, two variants of the basic gradient descent method have been developed: stochastic gradient descent and batch gradient descent.
    2019-11-24 13:06:03 download
    Points: 1
  • svm-confidence-interval
    LSSVM: least-squares support vector machine regression model with confidence-interval prediction; simple to use and easy to learn.
    2017-03-11 20:02:28 download
    Points: 1
  • robertpreittsobel
    Programs for the Roberts, Prewitt, and Sobel operators on images, from Gonzalez's book Digital Image Processing.
    2010-10-30 12:55:29 download
    Points: 1
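The steepest-descent entry above can be sketched concretely as gradient descent on a linear least-squares problem min over x of ||Ax − b||². The matrix A, vector b, and step-size rule below are invented for this illustration (Python/NumPy rather than the uploaded MATLAB):

```python
import numpy as np

# Steepest descent on the least-squares objective f(x) = 0.5 * ||A x - b||^2.
# A and b are made-up example data with a known exact solution x_true.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                           # consistent system, solution x_true

x = np.zeros(3)
L = np.linalg.eigvalsh(A.T @ A).max()    # Lipschitz constant of the gradient
lr = 1.0 / L                             # step size small enough to converge
for _ in range(500):
    grad = A.T @ (A @ x - b)             # gradient of f at x
    x -= lr * grad                       # step opposite the gradient

# x has converged to the least-squares solution x_true
```

Maximizing a function instead would flip the update to `x += lr * grad` (gradient ascent), exactly as the entry describes.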
  • 696516资源总数
  • 106914会员总数
  • 0今日下载