Title: Machine Learning and Knowledge Discovery in Databases; European Conference. Hendrik Blockeel, Kristian Kersting, Filip ?elezny. Conference proceedings.
Parallel Boosting with Momentum. …algorithm, which we call BOOM, for boosting with momentum, enjoys the merits of both techniques. Namely, BOOM retains the momentum and convergence properties of the accelerated gradient method while taking into account the curvature of the objective function. We describe a … implementation of BOOM which …
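To make the momentum-plus-curvature idea concrete, here is a minimal Python sketch of accelerated (Nesterov-style) gradient descent with a per-coordinate curvature scaling on a toy least-squares objective. It only illustrates the two ingredients the abstract combines; it is not the paper's BOOM algorithm or its parallel implementation, and the toy objective, step factor, and iteration count are arbitrary choices.

    import numpy as np

    # Toy objective: f(w) = 0.5 * ||X w - y||^2 (least squares).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))
    y = rng.normal(size=200)

    w = np.zeros(20)
    w_prev = np.zeros(20)
    diag_curv = (X ** 2).sum(axis=0)              # per-coordinate curvature (diagonal of X^T X)
    for t in range(1, 101):
        mu = (t - 1) / (t + 2)                    # Nesterov-style momentum coefficient
        v = w + mu * (w - w_prev)                 # momentum look-ahead point
        grad = X.T @ (X @ v - y)                  # gradient at the look-ahead point
        w_prev = w
        w = v - 0.5 * grad / (diag_curv + 1e-8)   # conservative curvature-scaled step
    print("final loss:", 0.5 * np.sum((X @ w - y) ** 2))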
Inner Ensembles: Using Ensemble Methods Inside the Learning Algorithm. …applied in many more situations than they have been previously. Instead of using them only to combine the outputs of an algorithm, we can apply them to the decisions made inside the learning algorithm itself. We call this approach Inner Ensembles. The main contribution of this work is to demonstrate how …
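One way to picture the "ensemble inside the learner" idea is to replace a single internal parameter estimate with an average over bootstrap estimates. The sketch below does this for the per-class mean and variance inside a Gaussian naive Bayes trainer; the function name, bag count, and the naive Bayes setting are illustrative assumptions, not necessarily the paper's instantiation.

    import numpy as np

    def bagged_class_stats(X_c, n_bags=25, rng=None):
        # Estimate per-feature mean/variance for one class by averaging
        # bootstrap estimates instead of using a single maximum-likelihood fit.
        rng = rng or np.random.default_rng(0)
        means, variances = [], []
        for _ in range(n_bags):
            idx = rng.integers(0, len(X_c), size=len(X_c))   # bootstrap resample
            sample = X_c[idx]
            means.append(sample.mean(axis=0))
            variances.append(sample.var(axis=0) + 1e-9)
        return np.mean(means, axis=0), np.mean(variances, axis=0)

    # Inside a Gaussian naive Bayes trainer one would call, for each class c:
    #     mu_c, var_c = bagged_class_stats(X[y == c])
    # and plug mu_c, var_c into the usual Gaussian likelihood.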
Bundle CDN: A Highly Parallelized Approach for Large-Scale ?1-Regularized Logistic Regression. … by their divergence under a high degree of parallelism (DOP), or need data pre-processing to avoid divergence. To better exploit parallelism, we propose a coordinate descent based parallel algorithm that needs no data pre-processing, termed Bundle Coordinate Descent Newton (BCDN), and apply it to ?1-regularized logistic regression …
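For orientation, the sequential building block that such methods parallelize, coordinate descent Newton (CDN) for ?1-regularized logistic regression with an Armijo-style backtracking line search, can be sketched roughly as follows. This is a simplified single-threaded sketch, not the paper's bundle or parallel scheme, and the parameter values are illustrative.

    import numpy as np

    def l1_logreg_cdn(X, y, lam=1.0, n_sweeps=20, sigma=0.01, beta=0.5):
        # Minimize  sum_i log(1 + exp(-y_i * x_i . w)) + lam * ||w||_1  by cycling
        # over coordinates; each coordinate takes a 1-d Newton step followed by an
        # Armijo-style backtracking line search.  Labels y must be +/-1.
        n, p = X.shape
        w = np.zeros(p)
        z = X @ w                                     # cached margins X @ w

        def obj(z_, w_abs_sum):
            return np.log1p(np.exp(-y * z_)).sum() + lam * w_abs_sum

        for _ in range(n_sweeps):
            for j in range(p):
                sig = 1.0 / (1.0 + np.exp(y * z))     # d loss_i / d z_i = -y_i * sig_i
                g = -(y * sig) @ X[:, j]              # first derivative w.r.t. w_j
                h = (sig * (1 - sig) * X[:, j] ** 2).sum() + 1e-12   # second derivative
                # Closed-form Newton direction of the 1-d L1-regularized subproblem
                if g + lam <= h * w[j]:
                    d = -(g + lam) / h
                elif g - lam >= h * w[j]:
                    d = -(g - lam) / h
                else:
                    d = -w[j]
                # Armijo backtracking on the true objective
                w_abs = np.abs(w).sum()
                f0 = obj(z, w_abs)
                step = 1.0
                while step > 1e-8:
                    wj_new = w[j] + step * d
                    z_new = z + step * d * X[:, j]
                    f_new = obj(z_new, w_abs - abs(w[j]) + abs(wj_new))
                    decrease = g * d + lam * abs(w[j] + d) - lam * abs(w[j])
                    if f_new <= f0 + sigma * step * decrease:
                        w[j], z = wj_new, z_new
                        break
                    step *= beta
        return w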
MORD: Multi-class Classifier for Ordinal Regression. …not only allows designing new learning algorithms for ordinal regression using existing methods for multi-class classification, but also allows deriving new models for ordinal regression. For example, one can convert learning of an ordinal classifier with an (almost) arbitrary loss function into a convex un…
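As a concrete illustration of phrasing ordinal regression as a multi-class scoring problem, one common parameterization scores class y of K by y * <w, x> + b_y and predicts the argmax; for suitable b this reduces to thresholding <w, x>. The sketch below shows only this prediction rule with made-up parameters; it is not claimed to be the exact MORD model.

    import numpy as np

    def ordinal_predict(X, w, b):
        # Score class y (1..K) as  y * <w, x> + b_y  and predict the argmax.
        K = len(b)
        scores = np.outer(X @ w, np.arange(1, K + 1)) + b    # (n, K) score matrix
        return scores.argmax(axis=1) + 1                      # labels in 1..K

    # Toy usage with hypothetical parameters (K = 4 ordinal classes):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))
    w = np.array([1.0, -0.5, 0.2])
    b = np.array([0.0, 0.3, 0.1, -0.5])
    print(ordinal_predict(X, w, b))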
Identifiability of Model Properties in Over-Parameterized Model Classes. …examples (x, f(x)), and the space of queries for the learned model (predicting function values for new examples x). However, in many learning scenarios the 3-way association between hypotheses, data, and queries can really be much looser. Model classes can be over-parameterized, i.e., different hypotheses …
Exploratory Learning. …labeled examples are provided for some classes. In this paper we present variants of well-known semi-supervised multiclass learning methods that are robust when the data contains an unknown number of classes. In particular, we present an "exploratory" extension of expectation-maximization (EM) that explores …
Semi-supervised Gaussian Process Ordinal Regression. … while unlabeled ordinal data are available in abundance. Designing a probabilistic semi-supervised classifier to perform ordinal regression is challenging. In this work, we propose a novel approach for semi-supervised ordinal regression using Gaussian Processes (GP). It uses the expectation-propagation …
Tractable Semi-supervised Learning of Complex Structured Prediction Models. …allow the direct use of tractable inference/learning algorithms (e.g., binary label or linear chain). Therefore, these methods cannot be applied to problems with complex structure. In this paper, we propose an approximate semi-supervised learning method that uses piecewise training for estimating the …
PSSDL: Probabilistic Semi-supervised Dictionary Learning. …availability of large labeled datasets. However, in many real-world applications, access to sufficient labeled data may be expensive and/or time consuming, but it is relatively easy to acquire a large amount of unlabeled data. In this paper, we propose a probabilistic framework for discriminative dictionary …
Embedding with Autoencoder Regularization. …can guarantee the "semantics" of the original high-dimensional data. Most of the existing embedding algorithms perform to maintain the … property. In this study, inspired by the remarkable success of representation learning and deep learning, we propose a framework of embedding with autoencoder regularization …
Locally Linear Landmarks for Large-Scale Manifold Learning. …a graph Laplacian. With large datasets, the eigendecomposition is too expensive, and is usually approximated by solving for a smaller graph defined on a subset of the points (landmarks) and then applying the Nystr?m formula to estimate the eigenvectors over all points. This has the problem that the …
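For reference, the Nystr?m extension the abstract refers to (estimating eigenvectors of a large affinity matrix from a landmark subset) looks roughly like this. It is a generic sketch for a kernel/affinity matrix with an RBF affinity and an arbitrary landmark count, not the Locally Linear Landmarks method itself, which the paper proposes as an improvement over this baseline.

    import numpy as np

    def nystrom_eigenvectors(K_nm, K_mm, k):
        # K_nm: affinities between all n points and m landmarks (n x m).
        # K_mm: affinities among the m landmarks (m x m).
        # Returns approximate top-k eigenvectors of the full n x n affinity matrix.
        evals, evecs = np.linalg.eigh(K_mm)
        order = np.argsort(evals)[::-1][:k]            # top-k landmark eigenpairs
        evals, evecs = evals[order], evecs[:, order]
        return K_nm @ evecs / evals                    # Nystrom extension, shape (n, k)

    def rbf(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    # Toy usage: 500 points, 50 randomly chosen landmarks (all values illustrative).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    landmarks = X[rng.choice(500, size=50, replace=False)]
    U = nystrom_eigenvectors(rbf(X, landmarks), rbf(landmarks, landmarks), k=3)
    print(U.shape)    # (500, 3)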
Parallel Boosting with Momentum. …properties of the accelerated gradient method while taking into account the curvature of the objective function. We describe a … implementation of BOOM which is suitable for massive high-dimensional datasets. We show experimentally that BOOM is especially effective in large-scale learning problems with rare yet informative features.
ISSN 0302-9743. …proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, ECML PKDD 2013, held in Prague, Czech Republic, in September 2013. The 111 revised research papers presented together with 5 invited talks were carefully reviewed and selected from 447 submissions. The papers …
Bundle CDN: A Highly Parallelized Approach for Large-Scale ?1-Regularized Logistic Regression. … Armijo line search to obtain the stepsize. Through theoretical analysis of global convergence, we show that BCDN is guaranteed to converge with a high DOP. Experimental evaluations over five public datasets show that BCDN can better exploit parallelism and outperforms state-of-the-art algorithms in speed, without losing testing accuracy.
MORD: Multi-class Classifier for Ordinal Regression. … on standard benchmarks as well as in solving a real-life problem. In particular, we show that the proposed piece-wise ordinal classifier applied to visual age estimation outperforms other standard prediction models.
AR-Boost: Reducing Overfitting by a Robust Data-Driven Regularization Strategy. …enables a natural extension to multiclass boosting, and further reduces overfitting in both the binary and multiclass cases. We derive bounds for training and generalization errors, and relate them to AdaBoost. Finally, we show empirical results on benchmark data that establish the robustness of our approach and improved performance overall.
Exploratory Learning. …explores different numbers of classes while learning. "Exploratory" SSL greatly improves performance on three datasets in terms of F1 on the classes with seed examples, i.e., the classes which are expected to be in the data. Our Exploratory EM algorithm also outperforms an SSL method based on non-parametric Bayesian clustering.
PSSDL: Probabilistic Semi-supervised Dictionary Learning. …dictionary learning which uses both the labeled and unlabeled data. Experimental results demonstrate that the performance of the proposed method is significantly better than state-of-the-art dictionary-based classification methods.
Lecture Notes in Computer Science. Cover image: http://image.papertrans.cn/m/image/620518.jpg
978-3-642-40993-6. Springer-Verlag Berlin Heidelberg 2013
Machine Learning and Knowledge Discovery in Databases, 978-3-642-40994-3. Series ISSN 0302-9743, Series E-ISSN 1611-3349
The Stochastic Gradient Descent for the Primal L1-SVM Optimization Revisited. …available an upper bound on the relative accuracy achieved, which provides a meaningful stopping criterion. In addition, we propose a mechanism of presenting the same pattern repeatedly to the algorithm which maintains the above properties. Finally, we give experimental evidence that algorithms constructed …
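As background, plain stochastic subgradient descent on the primal L1-SVM objective (L1 referring here to the hinge loss, with the usual squared-norm regularizer) can be sketched as below, in the Pegasos style with a 1/(lam * t) step size. The refinements the abstract mentions (the accuracy bound used as a stopping criterion and repeated presentation of the same pattern) are not shown, and the parameter values are illustrative.

    import numpy as np

    def svm_sgd(X, y, lam=0.01, epochs=10, seed=0):
        # Minimize  lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>)
        # with stochastic subgradient steps of size 1/(lam * t).  Labels y must be +/-1.
        rng = np.random.default_rng(seed)
        n, p = X.shape
        w, t = np.zeros(p), 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)
                margin = y[i] * (X[i] @ w)
                w *= (1.0 - eta * lam)          # step on the regularizer term
                if margin < 1:                  # hinge loss is active at this example
                    w += eta * y[i] * X[i]
        return w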
Identifiability of Model Properties in Over-Parameterized Model Classes. … the identification of temporal logic properties of probabilistic automata learned from sequence data, the identification of causal dependencies in probabilistic graphical models, and the transfer of probabilistic relational models to new domains.
Influence of Graph Construction on Semi-supervised Learning. …algorithms on a variety of graph construction methods and parameter values. The obtained results show that the mutual k-nearest neighbors (mutKNN) graph may be the best choice for adjacency graph construction, while the RBF kernel may be the best choice for weighted matrix generation. In addition, mutKNN …
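To make the graph-construction choices concrete, here is a small dense sketch that builds a mutual k-nearest-neighbor adjacency graph and weights its edges with an RBF kernel. The function name, k, and gamma are illustrative, and a real implementation for large data would use a sparse neighbor search rather than the O(n^2) distance matrix used here.

    import numpy as np

    def mutual_knn_rbf_graph(X, k=10, gamma=0.5):
        # Connect i and j only if each is among the other's k nearest neighbors
        # (mutual kNN); connected edges get weight exp(-gamma * ||xi - xj||^2).
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # squared distances
        np.fill_diagonal(d2, np.inf)                           # exclude self edges
        nn = np.argsort(d2, axis=1)[:, :k]                     # indices of k nearest neighbors
        A = np.zeros(d2.shape, dtype=bool)
        rows = np.repeat(np.arange(len(X)), k)
        A[rows, nn.ravel()] = True                             # directed kNN graph
        A = A & A.T                                            # keep mutual edges only
        return np.where(A, np.exp(-gamma * d2), 0.0)           # RBF-weighted adjacency

    # Usage: W = mutual_knn_rbf_graph(np.random.default_rng(0).normal(size=(100, 4)))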
Tractable Semi-supervised Learning of Complex Structured Prediction Models. …of using approximations, the approach is effective and yields good improvements in generalization performance over the plain supervised method. In addition, we demonstrate that our inference engine can be applied to other semi-supervised learning frameworks, and extends them to solve problems with complex …
Embedding with Autoencoder Regularization. … reconstruction error. It is worth mentioning that instead of operating in a batch mode as most of the previous embedding algorithms do, the proposed framework actually generates an … embedding model and thus supports incremental embedding efficiently. To show the effectiveness of EAER, we adapt …
Discovering Skylines of Subgroup Sets. …algorithms, and the accuracy of the levelwise method. Furthermore, we show that the skylines can be used for the objective evaluation of subgroup set heuristics. Finally, we show characteristics of the obtained skylines, which reveal that different quality-diversity trade-offs result in clearly different …
Difference-Based Estimates for Generalization-Aware Subgroup Discovery. … We show how this technique can be applied to the most popular interestingness measures for binary as well as numeric target concepts. The novel bounds are incorporated in an efficient algorithm, which outperforms previous methods by up to an order of magnitude.
Conference proceedings 2013. …statistical learning; semi-supervised learning; unsupervised learning; subgroup discovery, outlier detection and anomaly detection; privacy and security; evaluation; applications; and medical applications.
Baidya Nath Saha, Gautam Kunapuli, Nilanjan Ray, Joseph A. Maldjian, Sriraam Natarajan
Indraneel Mukherjee, Kevin Canini, Rafael Frongillo, Yoram Singer
Wenchao Yu, Guangxiang Zeng, Ping Luo, Fuzhen Zhuang, Qing He, Zhongzhi Shi
Yinjie Huang, Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos