
cnn regression

Published 2017-12-12 · File size: 1KB
Download points: 1 · Downloads: 35

Code description:

  Image registration via convolutional neural network regression. Compared with traditional registration algorithms, it obtains the registration parameters faster and reduces the registration error more effectively.
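A minimal sketch of the idea described above, assuming nothing about the actual contents of inm.py: a small convolutional network takes the fixed/moving image pair and directly regresses the transform parameters (here a 2-D translation), instead of iteratively optimising a similarity metric as a traditional registration algorithm would. All names, kernel sizes, and the network shape are illustrative, and the head is untrained.

```python
# Hypothetical sketch of CNN-regression image registration (not inm.py).
import numpy as np

def conv2d(x, k):
    """Valid 2-D cross-correlation of a single-channel image with kernel k."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def predict_shift(fixed, moving, kernels, w, b):
    """Forward pass: conv features of the stacked image pair -> ReLU ->
    global average pooling -> linear head regressing (dy, dx)."""
    pair = np.stack([fixed, moving])               # 2 x H x W input "channels"
    feats = []
    for k in kernels:
        resp = sum(conv2d(ch, k) for ch in pair)   # sum responses over channels
        feats.append(np.maximum(resp, 0).mean())   # ReLU + global average pool
    return np.array(feats) @ w + b                 # regressed parameters

rng = np.random.default_rng(0)
fixed = rng.random((16, 16))
moving = np.roll(fixed, (2, 3), axis=(0, 1))       # pair with a known translation
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
w, b = rng.standard_normal((4, 2)), np.zeros(2)    # untrained regression head
params = predict_shift(fixed, moving, kernels, w, b)
print(params.shape)                                # one value per parameter
```

In a trained version, the kernels and head weights would be fitted on image pairs with known transforms, so a single forward pass replaces the iterative optimisation loop; this is where the speed advantage claimed in the description comes from.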

File list:

inm.py, 2741 bytes, 2017-12-12



  • FFT842
    Source code for radix-2, radix-4, and radix-8 FFTs, with detailed comments.
    2021-02-20 14:09:43 · Points: 1
  • aphysicaltool
    A simple tool for computing the mean and uncertainty of measured values, as commonly needed in undergraduate physics experiments.
    2015-01-16 17:01:29 · Points: 1
  • Sllcorrelation
    Analyzes the lead/lag relationship between two time series and computes the lag length; mainly used in hydrometeorological research.
    2020-07-10 10:08:55 · Points: 1
  • 多目标蚁群
    Multi-objective ant colony algorithm for solving the multi-objective minimum spanning tree problem.
    2017-11-28 09:47:16 · Points: 1
  • sin_fft
    Spectrum estimation of a sine function implemented with the FFT, with detailed comments for easy understanding and an accompanying simulation plot.
    2014-11-04 09:45:07 · Points: 1
  • five-ways
    Numerical solution of nonlinear equations by five methods: Newton's method, the secant method, bisection, Steffensen's method, and the simplified Newton method, with an analysis of their properties.
    2013-12-10 14:41:01 · Points: 1
  • Conjugate-Gradient-Method
    The conjugate gradient method sits between steepest descent and Newton's method: it needs only first-derivative information, yet it overcomes the slow convergence of steepest descent and avoids Newton's method's need to store, compute, and invert the Hessian. It is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization. Its advantages are low memory requirements, finite-step convergence, high stability, and no external parameters.
    2017-03-14 15:48:15 · Points: 1
  • ieee33
    33-node power flow calculation using Newton's method in rectangular coordinates.
    2011-08-03 15:21:50 · Points: 1
  • SQUARESQUEEZE
    Finite-difference solution of the Reynolds equation, programmed in Fortran.
    2016-10-18 11:16:37 · Points: 1
  • matlab常微分方程
    MATLAB functions for the numerical solution of ordinary differential equations.
    2020-06-16 23:00:02 · Points: 1