
I wanted to install AML (Azure Machine Learning Workbench) to explore its capabilities and gauge whether it would help my machine learning work.
I followed the official installation steps.
But I had no luck. The installation failed repeatedly and produced the log below (there is a lot of output; I have excerpted only the parts that seem related to the failure):

......
2018/7/13 2:37:56 30: Installer [Information] - 0: Executing command: /Users/liguang/Library/Caches/AmlWorkbench/Python/bin/python -s -E -m conda install --no-deps --yes --force --offline "/private/tmp/AmlInstaller/six.macos-1.11.0/six-1.11.0-py35_1.tar.bz2"
2018/7/13 2:37:58 33: Installer [Information] - 0: Returned exit code 1
2018/7/13 2:37:58 33: Installer [Information] - 0: Output: [02:37:58] StandardError: An unexpected error has occurred.
[02:37:58] StandardError: Please consider posting the following information to the
[02:37:58] StandardError: conda GitHub issue tracker at:
[02:37:58] StandardError:
[02:37:58] StandardError: https://github.com/conda/conda/issues
[02:37:58] StandardError:
[02:37:58] StandardError:
[02:37:58] StandardError:
[02:37:58] StandardError: Current conda install:
[02:37:58] StandardError:
[02:37:58] StandardError: platform : osx-64
[02:37:58] StandardError: conda version : 4.3.27
[02:37:58] StandardError: conda is private : False
[02:37:58] StandardError: conda-env version : 4.3.27
[02:37:58] StandardError: conda-build version : not installed
[02:37:58] StandardError: python version : 3.5.2.final.0
[02:37:58] StandardError: requests version : 2.11.1
[02:37:58] StandardError: root environment : /Users/liguang/Library/Caches/AmlWorkbench/Python (writable)
[02:37:58] StandardError: default environment : /Users/liguang/Library/Caches/AmlWorkbench/Python
[02:37:58] StandardError: envs directories : /Users/liguang/Library/Caches/AmlWorkbench/Python/envs
[02:37:58] StandardError: /Users/liguang/.conda/envs
[02:37:58] StandardError: package cache : /Users/liguang/Library/Caches/AmlWorkbench/Python/pkgs
[02:37:58] StandardError: /Users/liguang/.conda/pkgs
[02:37:58] StandardError: channel URLs : https://repo.continuum.io/pkgs/main/osx-64 (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/main/noarch (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/free/osx-64 (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/free/noarch (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/r/osx-64 (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/r/noarch (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/pro/osx-64 (offline)
[02:37:58] StandardError: https://repo.continuum.io/pkgs/pro/noarch (offline)
[02:37:58] StandardError: config file : None
[02:37:58] StandardError: netrc file : None
[02:37:58] StandardError: offline mode : True
[02:37:58] StandardError: user-agent : conda/4.3.27 requests/2.11.1 CPython/3.5.2 Darwin/17.6.0 OSX/10.13.5
[02:37:58] StandardError: UID:GID : 501:20
[02:37:58] StandardError:
[02:37:58] StandardError: $ /Users/liguang/Library/Caches/AmlWorkbench/Python/lib/python3.5/site-packages/conda/__main__.py install --no-deps --yes --force --offline /private/tmp/AmlInstaller/six.macos-1.11.0/six-1.11.0-py35_1.tar.bz2
[02:37:58] StandardError:
......

Searching the web, I did not find a solution that directly helped. I almost gave up. In fact, I did not even know what keywords to search for, which was frustrating.

Fortunately, I kept at it.
I went back and read the log again, and felt that the failure might be related to conda.
So, just to see what would happen, I:

  • manually installed Anaconda3,
  • then installed AML again.

This worked for me, though I don't know why it did; it may not work for others.

In addition, after installing Anaconda3, some of my original Python packages seemed to disappear, such as xgboost, LightGBM, and tensorflow. I don't know why either; I just installed them again. +_+

Regularization

Add a regularization term to down-weight the higher-order terms and avoid overfitting.

This is because higher-order terms tend to fit the training data as closely as possible, which leads to overfitting.

$\theta_0$ is not regularized.
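For concreteness, here is the standard regularized gradient-descent update in the usual notation (a sketch; $\alpha$ is the learning rate, $\lambda$ the regularization strength, and $m$ the number of training samples, none of which appear elsewhere in these notes). Only $\theta_j$ with $j \ge 1$ carries the penalty term:

$\theta_0 := \theta_0 - \alpha\frac{1}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_0^{(i)}$
$\theta_j := \theta_j - \alpha\left[\frac{1}{m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})x_j^{(i)} + \frac{\lambda}{m}\theta_j\right],\quad j = 1,\dots,n$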

Linear regression hypothesis: $h_\theta(x) = \theta^T x$

Logistic regression hypothesis: $h_\theta(x) = \frac{1}{1+e^{-\theta^T x}}$

Note: the parameter-update rules of the two look essentially the same, but because the hypothesis functions differ, the gradients actually used in the gradient-descent updates are different.
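To make this concrete, here is a minimal Python sketch (NumPy only; the function names, hyperparameter values, and toy data are my own illustrative choices, not part of the material above). The regularized update is the same code for both models; only the hypothesis function passed in changes, and that is what makes the gradients differ:

```python
import numpy as np

def linear_hypothesis(theta, X):
    # h_theta(x) = theta^T x
    return X @ theta

def logistic_hypothesis(theta, X):
    # h_theta(x) = 1 / (1 + exp(-theta^T x))
    return 1.0 / (1.0 + np.exp(-(X @ theta)))

def gradient_step(theta, X, y, hypothesis, alpha=0.1, lam=1.0):
    """One gradient-descent step with L2 regularization;
    theta[0] (the bias term) is excluded from the penalty."""
    m = len(y)
    error = hypothesis(theta, X) - y       # (h_theta(x) - y), shape (m,)
    grad = (X.T @ error) / m               # unregularized gradient
    grad[1:] += (lam / m) * theta[1:]      # regularize every theta except theta_0
    return theta - alpha * grad

# Tiny demo: identical update code, different hypothesis -> different gradients.
rng = np.random.default_rng(0)
X = np.c_[np.ones(20), rng.normal(size=(20, 2))]   # first column is x_0 = 1
y_linear = X @ np.array([1.0, 2.0, -1.0]) + 0.1 * rng.normal(size=20)
y_binary = (y_linear > y_linear.mean()).astype(float)

theta = np.zeros(3)
print(gradient_step(theta, X, y_linear, linear_hypothesis))
print(gradient_step(theta, X, y_binary, logistic_hypothesis))
```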

Particle Swarm Optimization (PSO) is a global stochastic search algorithm.

It is inspired by the flocking behavior of birds.

The swarm iterates, gradually searching its way toward the global (or a local) optimum.

At each iteration, the global best position is updated if the current best improves on the previous global best, and each particle's velocity ($v$) and position ($x$) are then updated from the particle's own best position so far and the global best position.

$v_{i}^{(k+1)} = wv_{i}^{(k)} + c_1r_{i}^{(k)}(p_{i}^{(k)} - x_{i}^{(k)}) + c_2s_{i}^{(k)}(g_{i}^{(k)} - x_{i}^{(k)})$
$x_{i}^{(k+1)} = x_{i}^{(k)} + v_{i}^{(k+1)}$

$k$: the iteration index
$i$: the index of the particle in the swarm
$x_{i}$: the position of the $i$-th particle, an $n$-dimensional vector in the search space
$p_{i}$: the best position found so far by particle $i$
$g_{i}$: the global best position found by the swarm so far
$w$: the inertia weight applied to the previous velocity
$r_{i},s_{i}$: $n$-dimensional random vectors that add randomness to the updates so the search can explore globally
$c_1,c_2$: the weights on the pull toward the particle's own best position and toward the global best position

That is the PSO algorithm.
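Below is a minimal Python sketch of these update rules (NumPy only; the sphere objective, swarm size, bounds, and the values chosen for $w$, $c_1$, $c_2$ are my own illustrative assumptions):

```python
import numpy as np

def pso(objective, dim, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions x_i
    v = np.zeros((n_particles, dim))                   # velocities v_i
    p = x.copy()                                       # personal best positions p_i
    p_val = np.array([objective(xi) for xi in x])
    g = p[p_val.argmin()].copy()                       # global best position g
    g_val = p_val.min()

    for _ in range(n_iters):
        r = rng.random((n_particles, dim))             # random vectors r_i
        s = rng.random((n_particles, dim))             # random vectors s_i
        # v_i^(k+1) = w*v_i^(k) + c1*r_i*(p_i - x_i) + c2*s_i*(g - x_i)
        v = w * v + c1 * r * (p - x) + c2 * s * (g - x)
        # x_i^(k+1) = x_i^(k) + v_i^(k+1)
        x = x + v
        vals = np.array([objective(xi) for xi in x])
        improved = vals < p_val                        # update personal bests
        p[improved], p_val[improved] = x[improved], vals[improved]
        if p_val.min() < g_val:                        # update the global best
            g, g_val = p[p_val.argmin()].copy(), p_val.min()
    return g, g_val

# Example: minimize the sphere function f(x) = sum(x^2); the optimum is the origin.
best_x, best_val = pso(lambda z: float(np.sum(z ** 2)), dim=3)
print(best_x, best_val)
```

Note how the random vectors $r_i$ and $s_i$ only randomize the strength of the pull toward $p_i$ and $g$; this is what the symbol list above describes as adding randomness to enable global search.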

P.S. There is a small point I haven't fully worked out: if the search moves in random directions with random step sizes in $n$-dimensional space, is that efficient? Will it keep moving toward the optimum? To think about [20180716]; partial derivatives?