Alternatives to Predictive Suite

Compare Predictive Suite alternatives for your business or organization using the curated list below. SourceForge ranks the best alternatives to Predictive Suite in 2026. Compare features, ratings, user reviews, pricing, and more from Predictive Suite competitors and alternatives in order to make an informed decision for your business.

  • 1
    NeuroIntelligence
    NeuroIntelligence is a neural network software application designed to assist experts in neural networks, data mining, pattern recognition, and predictive modeling in solving real-world problems. NeuroIntelligence features only proven neural network modeling algorithms and neural net techniques; the software is fast and easy to use. Visualized architecture search, neural network training and testing. Neural network architecture search, fitness bars, network training graph comparison. Training graphs, dataset error, network error, weight and error distributions, neural network input importance. Testing, actual vs. output graph, scatter plot, response graph, ROC curve, confusion matrix. The interface of NeuroIntelligence is optimized to solve data mining, forecasting, classification, and pattern recognition problems. You can create a better solution much faster using the tool's easy-to-use GUI and unique time-saving capabilities.
    Starting Price: $497 per user
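    The confusion matrix NeuroIntelligence reports for a trained classifier can be tallied directly from actual vs. predicted labels. A minimal sketch in plain Python, with made-up labels (not NeuroIntelligence output):

```python
# Tally a 2x2 confusion matrix for a binary classifier.
# The label vectors below are illustrative only.
def confusion_matrix(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return {"TP": tp, "TN": tn, "FP": fp, "FN": fn}

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
cm = confusion_matrix(actual, predicted)
accuracy = (cm["TP"] + cm["TN"]) / len(actual)
print(cm, accuracy)   # {'TP': 3, 'TN': 3, 'FP': 1, 'FN': 1} 0.75
```

    Precision, recall, and the points of an ROC curve are all derived from these four counts.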
  • 2
    NXG Logic Explorer
    NXG Logic Explorer is a Windows-based machine learning package designed for data analytics, predictive analytics, unsupervised class discovery, supervised class prediction, and simulation. It enhances productivity by reducing the time required for various procedures, enabling users to identify novel patterns in exploratory datasets and perform hypothesis testing, simulations, and text mining to extract meaningful insights. Key features include automatic de-stringing of messy Excel input files, parallel feature analysis for generating summary statistics, Shapiro-Wilk tests, histograms, and count frequencies for multiple continuous and categorical variables. It allows simultaneous execution of ANOVA, Welch ANOVA, chi-squared, and Bartlett's tests on multiple variables, and automatically generates multivariable linear, logistic, and Cox proportional hazards regression models based on a default p-value criterion for filtering from univariate models.
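    The one-way ANOVA that NXG Logic Explorer runs across multiple variables reduces to a ratio of between-group to within-group variance. A minimal sketch in plain Python, with invented sample groups:

```python
# One-way ANOVA F-statistic computed from scratch; the three
# sample groups below are made up for illustration.
def one_way_anova_f(groups):
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    # Mean squares: df_between = k - 1, df_within = n - k
    return (ss_between / (k - 1)) / (ss_within / (n - k))

groups = [[4.0, 5.0, 6.0], [7.0, 8.0, 9.0], [10.0, 11.0, 12.0]]
f_stat = one_way_anova_f(groups)
print(round(f_stat, 3))   # 27.0
```

    A large F relative to the F-distribution's critical value indicates that at least one group mean differs from the others.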
  • 3
    PureMind
    Computer vision and artificial intelligence (AI) help train equipment to control product quality in manufacturing, train robots for autonomous movement and safety, train cameras to monitor and analyze retail traffic, recognize types and colors of cars or food in the fridge, or build a map or 3D model of a space from video. Algorithms help predict sales in your business, find relationships between metrics, publications, and growth, classify customers to prepare personal offers, interpret and visualize data, and extract the most important information from text and video. Data mining, regression, classification, correlation and cluster analysis, decision trees, prediction models, graphs, neural networks. Text classification, understanding, summarization and auto-tagging, named-entity recognition, text-similarity comparison, sentiment analysis, dialog and QA systems. Detection, segmentation, recognition, recovery, and image/video generation.
  • 4
    NeuroShell Trader
    If you have a set of favorite indicators but don't have a set of profitable trading rules, the pattern recognition of an artificial neural network may be the solution. Neural networks analyze your favorite indicators, recognize multi-dimensional patterns too complex to visualize, predict, and forecast market movements, and then generate trading rules based on those patterns, predictions, and forecasts. With NeuroShell Trader's proprietary fast training 'Turboprop 2' neural network you no longer need to be a neural network expert. Inserting neural network trading is as easy as inserting an indicator. NeuroShell Trader's point-and-click interface allows you to easily create automated trading systems based on technical analysis indicators and neural network market forecasts without any code or programming.
    Starting Price: $1,495 one-time payment
  • 5
    Plug&Score Modeler
    An extremely easy-to-use scoring system. A tool created by scoring experts for small and medium-sized credit organizations. Exactly what you need for accurate real-time scoring (and only for scoring) without obsolete, hard-to-learn features. The best cost-to-value ratio on the market. Up and running within a few days. Easy to use due to its wizard-based scorecard modeling interface. Monitor and validate scorecards using a set of pre-defined reports. Reject inference with automated and manual inference methods. Automated binning based on chi-square and manual binning based on WOE. Automatic and manual sampling. Graphical statistics. Portfolio data filtering, sorting, and re-assignment to “Good” and “Bad”. Transformation of numeric variables into categorical ones. Correlation coefficients (a correlation coefficient is provided for each pair of variables in the dataset, before and after the binning procedure).
    Starting Price: $9950 one-time payment
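    The WOE (Weight of Evidence) values behind Plug&Score's manual binning compare each bin's share of "Good" cases to its share of "Bad" cases. A hedged sketch in plain Python, with illustrative bin counts:

```python
import math

# Weight of Evidence per bin: WOE = ln((good_i/good_total) / (bad_i/bad_total)).
# The age bins and counts below are invented for illustration.
def woe(bins):
    total_good = sum(g for g, b in bins.values())
    total_bad = sum(b for g, b in bins.values())
    return {
        name: math.log((g / total_good) / (b / total_bad))
        for name, (g, b) in bins.items()
    }

# Each bin maps to (count of "Good", count of "Bad") applicants.
age_bins = {"18-25": (100, 50), "26-40": (300, 60), "41+": (200, 40)}
weights = woe(age_bins)
print({k: round(v, 3) for k, v in weights.items()})
```

    Negative WOE flags bins with a disproportionate share of "Bad" cases; bins with similar WOE are candidates for merging.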
  • 6
    GeneXproTools
    The Microsoft award-winning GeneXproTools is an extremely flexible modeling tool designed for regression, logistic regression, classification, time series prediction, and logic synthesis. GeneXproTools is very easy to use, and is in fact as easy as importing your data and then clicking the Start button to create a great model. GeneXproTools is available in five editions: Home, Standard, Advanced, Professional, and Enterprise. Academic versions are also available at half price for educational institutions and students. GeneXproTools can process datasets with tens of thousands of variables and effortlessly extract the most significant features and their relationships. GeneXproTools is also a very user-friendly application, simplifying access to all types of data stores, from raw text files to databases and Excel spreadsheets. You don't need to know any programming language to create powerful and accurate models.
    Starting Price: €659 per year
  • 7
    Quark Analytics
    In a secure and manageable environment, the user can quickly extract insights from data. You can gather data in several formats and data types, create new variables, and transform and select the cases of interest. You can examine and analyze data through proper data analysis for numerical and categorical variables and check the results in a table or graphically. Explore the relationships between variables and test their significance: correlations (Pearson, Spearman), chi-square tests, t-tests for independent samples, Mann-Whitney, ANOVA, Kruskal-Wallis, and much more. You can easily select and calculate the most common measures of scale reliability and test whether the dimensions in your data are consistent, using measures such as Cronbach's alpha (raw and standardized), alpha if item deleted, Guttman's lambda-6, intraclass correlation coefficients (ICC), and much more.
    Starting Price: $29.90/month/user
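    Cronbach's alpha, one of the reliability measures mentioned above, can be computed from an item-by-respondent score matrix. A minimal sketch in plain Python, with an invented 4-item, 5-respondent dataset (not Quark Analytics code):

```python
from statistics import variance

# Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
def cronbach_alpha(items):
    k = len(items)                              # number of items
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Rows are items, columns are respondents (Likert-style scores);
# the numbers are invented for illustration.
items = [
    [3, 4, 3, 5, 4],
    [3, 5, 4, 5, 4],
    [2, 4, 3, 4, 3],
    [3, 4, 4, 5, 4],
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))   # 0.957
```

    Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.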
  • 8
    DAVinCI LABS
    When you select a target you want to predict, it learns patterns in the data through a rich algorithm to create a predictive model. If you specify a variable that is a criterion for decision making, it automatically classifies clusters showing significant trends and presents the characteristics of each cluster as a rule. If the characteristics of the data change with time, you can predict the target value at a specific point in the future by analyzing the trend according to the time variable. Even when the characteristics of the data are unclear, it automatically classifies clusters with special tendencies, which can be used to identify outliers in new data or gain new insights. For example, marketing targets might be selected by considering gender, age, and whether the customer holds a mortgage loan; the tool helps surface additional variables worth considering.
  • 9
    IBM Watson Machine Learning Accelerator
    Accelerate your deep learning workload. Speed your time to value with AI model training and inference. With advancements in compute, algorithms, and data access, enterprises are adopting deep learning more widely to extract and scale insight through speech recognition, natural language processing, and image classification. Deep learning can interpret text, images, audio, and video at scale, generating patterns for recommendation engines, sentiment analysis, financial risk modeling, and anomaly detection. High computational power is required to process neural networks due to the number of layers and the volumes of data needed to train them. Furthermore, businesses are struggling to show results from deep learning experiments implemented in silos.
  • 10
    Imubit
    Imubit’s AI platform delivers real-time, closed-loop process optimization for heavy-process industries by combining a dynamic process simulator, reinforcement-learning neural controller, and performance dashboards. The dynamic simulator is trained on years of historical plant data and guided by first principles to build a virtual model of the true process, enabling what-if simulation of variable relationships, constraint changes, and operating strategy shifts. The reinforcement-learning controller, trained offline with millions of trial-and-error scenarios, is then deployed to optimize control variables continuously, maximizing margins while respecting safe-operating constraints. Live dashboards track model availability, engagement, uptime and offer interactive visualizations of bound values, operational limits, and KPI trends. Use cases include aligning economic strategy with real-time operations and detecting process degradation.
  • 11
    SAS Visual Statistics
    With SAS Visual Statistics, multiple users can explore data, then interactively create and refine predictive models. Your data scientists and statisticians can act on observations at a granular level using the most appropriate analytical modeling techniques. The result? You'll unearth insights at unprecedented speeds, and find new ways to grow revenue. Easily build and refine models to target specific groups or segments, and run numerous scenarios simultaneously. You can ask more what-if questions to get better results. And put results into action with an automatically generated score code. Empower multiple users to interact with data visually – to add or change variables, remove outliers, etc. Instantly see how changes affect your model's predictive power, and make refinements quickly. Data science teams have the ultimate flexibility of working in their language of choice, so they can use their skills to the fullest. SAS Visual Statistics unites all analytical assets.
  • 12
    DataMelt
    jWork.ORG
    DataMelt (or "DMelt") is an environment for numeric computation, data analysis, data mining, computational statistics, and data visualization. DataMelt can be used to plot functions and data in 2D and 3D, perform statistical tests, data mining, numeric computations, function minimization, and linear algebra, and solve systems of linear and differential equations. Linear, non-linear, and symbolic regression are also available. Neural networks and various data-manipulation methods are integrated via the Java API. Elements of symbolic computation using Octave/MATLAB-style scripting are supported. DataMelt is a computational environment for the Java platform. It can be used with different programming languages on different operating systems. Unlike other statistical programs, it is not limited to a single programming language. The software combines the world's most popular enterprise language, Java, with popular scripting languages used in data science, such as Jython (Python), Groovy, and JRuby.
  • 13
    OnPoint CORTEX
    OnPoint - A Koch Engineered Solutions Company
    OnPoint’s CORTEX™ is an advanced analytics platform that leverages historical data along with your process engineers’ expertise to drive profits through operational efficiencies such as increased production and decreased downtime. In comparison to simple regression or statistical approaches, CORTEX combines machine learning with high compute power so models can learn from your complex process data. Load your data as-is and CORTEX will clean it, impute missing values, and handle categorical variables. Visualize and remove outliers. Add rows and columns to your data and learn which variables are important to the process. CORTEX's proprietary algorithm eliminates the need for you to hunt for the best model: MaGE builds a variety of models in the platform, plus an optimized ensemble model, and then provides scores for all of them.
  • 14
    LiveLink for MATLAB
    Seamlessly integrate COMSOL Multiphysics® with MATLAB® to extend your modeling with scripting programming in the MATLAB environment. LiveLink™ for MATLAB® allows you to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. Enhance your in-house MATLAB code with powerful multiphysics simulations. Base your geometry modeling on probabilistic or image data. Use multiphysics models together with Monte Carlo simulations and genetic algorithms. Export COMSOL models on state-space matrix format for incorporating into control systems. Interface in the COMSOL Desktop® environment enables the use of MATLAB® functions while modeling. Manipulate your models from the command line or script to parameterize the geometry, physics, or the solution scheme.
  • 15
    alvaModel
    Alvascience
    alvaModel is a software tool for building, validating, comparing, and applying QSAR and QSPR models. It supports regression and classification workflows based on molecular descriptors and fingerprints, with a strong focus on model transparency, interpretability, and scientific robustness. The software includes multiple data splitting strategies, variable selection methods, modeling algorithms, and comprehensive internal and external validation procedures. alvaModel provides diagnostic plots, applicability domain analysis, and model comparison tools to support the identification of reliable and predictive models. Designed according to best practices in chemometrics, alvaModel facilitates the development of interpretable models consistent with the OECD principles for QSAR validation, making it suitable for research and regulatory-oriented applications. The graphical interface guides users through the entire modeling workflow while allowing full control over each modeling step.
  • 16
    Autobox
    Automatic Forecasting Systems
    Autobox is simply the easiest way to forecast. Designed with both the novice and expert forecaster in mind, it lets you load your data and forecast like a pro. No matter what method you currently use to forecast, Autobox will improve your ability to forecast accurately. Autobox won the prestigious "best dedicated forecasting program" distinction in the Principles of Forecasting textbook, which now also lives on as a website. AFS's unique approach doesn't try to shoehorn the data into a model or a limited number of models, allowing Autobox to combine history and causal variables in an optimal way, incorporating level shifts, local time trends, pulses, and seasonal pulses when needed. Autobox discovers new causal variables by gleaning patterns from historical forecast errors and outliers identified by the Autobox engine. Many cases reveal causal variables you may not have even known existed, e.g. promotions, holidays, day-of-the-week effects, and many others.
  • 17
    NLREG
    NLREG is a powerful statistical analysis program that performs linear and nonlinear regression analysis, surface and curve fitting. NLREG determines the values of parameters for an equation, whose form you specify, that cause the equation to best fit a set of data values. NLREG can handle linear, polynomial, exponential, logistic, periodic, and general nonlinear functions. Unlike many "nonlinear" regression programs that can only handle a limited set of function forms, NLREG can handle essentially any function whose form you can specify algebraically. NLREG features a full programming language with a syntax similar to C for specifying the function that is to be fitted to the data. This allows you to compute intermediate work variables, use conditionals, and even iterate in loops. With NLREG it is easy to construct piecewise functions that change form over different domains. Since the NLREG language includes arrays, you can even use tabular look-up methods to define the function.
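    One classic nonlinear form of the kind NLREG fits, y = a·exp(b·x), can also be fitted by log-linearization followed by ordinary least squares. A sketch in plain Python on synthetic data, illustrating the general technique rather than NLREG's own algorithm or syntax:

```python
import math

# Fit y = a * exp(b*x) by linearizing to ln(y) = ln(a) + b*x
# and solving ordinary least squares in closed form.
# The data points below are synthetic (generated from a=2, b=0.5).
def fit_exponential(xs, ys):
    lys = [math.log(y) for y in ys]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(lys) / n
    b = sum((x - mx) * (ly - my) for x, ly in zip(xs, lys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, b = fit_exponential(xs, ys)
print(round(a, 3), round(b, 3))   # 2.0 0.5
```

    Log-linearization only works for strictly positive data and implicitly reweights the errors, which is why general-purpose tools fit such equations with iterative nonlinear least squares instead.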
  • 18
    Scilab
    Scilab Enterprises
    Numerical analysis, or scientific computing, is the study of approximation techniques for numerically solving mathematical problems. Scilab provides graphics functions to visualize, annotate, and export data and offers many ways to create and customize various types of plots and charts. Scilab is a high-level programming language for scientific programming. It enables rapid prototyping of algorithms without having to deal with the complexity of lower-level programming languages such as C and Fortran (memory management, variable declaration). This is natively handled by Scilab, which results in a few lines of code for complex mathematical operations where other languages would require much longer code. It also comes with advanced data structures such as polynomials, matrices, and graphic handles, and provides an easily operable development environment.
  • 19
    AForge.NET
    AForge.NET is an open source C# framework designed for developers and researchers in the fields of computer vision and artificial intelligence - image processing, neural networks, genetic algorithms, fuzzy logic, machine learning, robotics, etc. Work on improving the framework is in constant progress, which means new features and namespaces are added regularly. To follow its progress, you can track the source repository's log or visit the project discussion group for the latest information. The framework is provided not only with different libraries and their sources but also with many sample applications that demonstrate its use, and with documentation help files in HTML Help format.
  • 20
    RASON
    Frontline Solvers
    RASON (RESTful Analytic Solver Object Notation) is a modeling language and analytics platform embedded in JSON and delivered via a REST API that makes it simple to create, test, solve, and deploy decision services powered by advanced analytic models directly into applications. It lets users define optimization, simulation, forecasting, machine learning, and business rules/decision tables using a high-level language that integrates naturally with JavaScript and RESTful workflows, making analytic models easy to embed into web or mobile apps and scale in the cloud. RASON supports a wide range of analytic capabilities, including linear and mixed-integer optimization, convex and nonlinear programming, Monte Carlo simulation with multiple distributions and stochastic programming methods, and predictive models such as regression, clustering, neural networks, and ensembles, plus DMN-compliant decision tables for business logic.
  • 21
    IBM Datacap
    Streamline the capture, recognition and classification of business documents. IBM® Datacap software is a key capability of the IBM Cloud Pak® for Business Automation. It streamlines the capture, recognition and classification of business documents. Its natural language processing, text analytics and machine learning technologies identify, classify and extract content from unstructured or variable paper documents. Supports multichannel input from scanners, faxes, emails, digital files such as PDF, and images from applications and mobile devices. Uses machine learning to automate the processing of complex or unknown formats and highly variable documents difficult to capture with traditional systems. Enables you to export documents and information to a range of applications and content repositories from IBM and other vendors. Offers configuration of capture workflows and applications using a simple point-and-click interface to speed deployment.
  • 22
    NVIDIA PhysicsNeMo
    NVIDIA PhysicsNeMo is an open source Python deep-learning framework for building, training, fine-tuning, and inferring physics-AI models that combine physics knowledge with data to accelerate simulations, create high-fidelity surrogate models, and enable near-real-time predictions across domains such as computational fluid dynamics, structural mechanics, electromagnetics, weather and climate, and digital twin applications. It provides scalable, GPU-accelerated tools and Python APIs built on PyTorch and released under the Apache 2.0 license, offering curated model architectures including physics-informed neural networks, neural operators, graph neural networks, and generative AI–based approaches so developers can harness physics-driven causality alongside observed data for engineering-grade modeling. PhysicsNeMo includes end-to-end training pipelines from geometry ingestion to differential equations, reference application recipes to jump-start workflows.
  • 23
    ndCurveMaster
    SigmaLab Tomas Cepowski
    ndCurveMaster is a specialized software designed for multivariable curve fitting. It automatically applies nonlinear regression equations to your datasets, which can consist of observed or measured values. The software supports curve and surface fitting in 2D, 3D, 4D, 5D, ..., nD dimensions. This means that no matter how complex your data is or how many variables it has, ndCurveMaster can handle it with ease. For example, ndCurveMaster can efficiently derive an optimal equation for a dataset with six inputs (x1 to x6) and an output Y, such as: Y = a0 + a1 · exp(x1)^-0.5 + a2 · ln(x2)^8 + ... + a6 · x6^5.2, to accurately match measured values. Utilizing machine learning numerical methods, ndCurveMaster automatically fits the most suitable nonlinear regression functions to your dataset and discovers the relationships between the inputs and output. This robust tool offers linear, polynomial, and nonlinear curve fitting and applies crucial validation and goodness-of-fit tests.
  • 24
    camLine Cornerstone
    Cornerstone data analysis software allows efficient work to design experiments and explore data, analyze dependencies, and find answers you can act upon immediately, interactively, and without any programming. Engineer-oriented execution of statistics tasks without being burdened with statistical details. Easy and fast correlation detection in the data, even on Big Data infrastructure. Reduce the number of experiments via statistically optimized experiment plans and speed up overall development. Quickly find a usable process model and root causes via exploratory and visual data analysis. Optimize your executed experiments via structured planning, data collection, and result analysis. Easily investigate how noise in the process variables influences the process responses. Automatic capture of compact, reusable workflows.
  • 25
    IMSL
    Perforce
    Enhance performance and save development time with IMSL numerical libraries. Achieve your strategic objectives using IMSL's build tools. Model regression, build decision trees, establish neural networks, and forecast time series with your IMSL library. Rigorously tested and proven for decades across all industries, the IMSL C Numerical Library gives companies a dependable, high-ROI solution for building cutting-edge analytics tools. From data mining and forecasting to advanced statistical analysis, the IMSL C Numerical Library can help teams quickly add sophisticated functionality to analytic applications. The IMSL C library makes integration and deployment easy. Enjoy easy migrations, support for common platforms and platform combinations, and no added infrastructure when embedding in databases or applications.
  • 26
    TradingVisionX
    TradingVisionX offers a suite of institutional-grade automated trading engines (Expert Advisors) built for the MetaTrader (MT5/MT4) ecosystem that use advanced algorithms and neural-adaptive logic to react to market volatility and execute trades with built-in risk management rather than static prediction-based systems. It includes products such as X Fusion AI, a hybrid neural-adaptive engine designed for trend capture and grid recovery with prop-firm-oriented safety controls, TrendMaster FX, a trend-following EA with a decade-plus development history and extensive backtesting, and AI TradingVision GPX, an AI performance engine that has generated verified revenue and combines neural network insights with traditional decision logic. It features real-time adaptability, dynamic stop-loss and entry adjustments based on volatility patterns, hard equity and drawdown limits to help protect funded accounts, optimized execution with low latency on typical VPS setups.
    Starting Price: $399 per month
  • 27
    Neural Designer
    Neural Designer is a powerful software tool for developing and deploying machine learning models. It provides a user-friendly interface that allows users to build, train, and evaluate neural networks without requiring extensive programming knowledge. With a wide range of features and algorithms, Neural Designer simplifies the entire machine learning workflow, from data preprocessing to model optimization. In addition, it supports various data types, including numerical, categorical, and text, making it versatile across domains. Additionally, Neural Designer offers automatic model selection and hyperparameter optimization, enabling users to find the best model for their data with minimal effort. Finally, its intuitive visualizations and comprehensive reports facilitate interpreting and understanding the model's performance.
    Starting Price: $2495/year (per user)
  • 28
    Blue Hexagon
    We’ve designed our real-time deep learning platform to deliver speed of detection, efficacy and coverage that sets a new standard for cyber defense. We train our neural networks with global threat data that we’ve curated carefully via threat repositories, dark web, our deployments and from partners. Just like layers of neural networks can recognize your image in photos, our proprietary architecture of neural networks can identify threats in both payloads and headers. Every day, Blue Hexagon Labs validates the accuracy of our models with new threats in the wild. Our neural networks can identify a wide range of threats — file and fileless malware, exploits, C2 communications, malicious domains across Windows, Android, Linux platforms. Deep learning is a subset of machine learning that uses multi-layered artificial neural networks to learn data representation.
  • 29
    VisionPro Deep Learning
    VisionPro Deep Learning is the best-in-class deep learning-based image analysis software designed for factory automation. Its field-tested algorithms are optimized specifically for machine vision, with a graphical user interface that simplifies neural network training without compromising performance. VisionPro Deep Learning solves complex applications that are too challenging for traditional machine vision alone, while providing a consistency and speed that aren’t possible with human inspection. When combined with VisionPro’s rule-based vision libraries, automation engineers can easily choose the best tool for the task at hand. VisionPro Deep Learning combines a comprehensive machine vision tool library with advanced deep learning tools inside a common development and deployment framework. It simplifies the development of highly variable vision applications.
  • 30
    Maximus
    Sapience
    Maximus is a marketing mix modeling, advanced analytics, and measurement platform. The tool is built with R using the latest and most advanced machine learning and statistical functions and methods. Marketing mix modeling analyzes historical activities, calculates the contribution to sales and ROI per activity, and predicts future results. Auto modeling and manual modeling with the help of a recommender based on best-fit tests. Compare multiple models and easily add or remove variables. Group variables under categories such as media, promotion, and seasonality. Tabulated reports and charts of contribution to sales.
  • 31
    WizWhy
    WizSoft
    WizWhy determines how the values of one field in the data are affected by the values of other fields. The system performs its analysis based on one field selected by the user as the dependent variable, while the other fields are the independent variables (or conditions). The dependent variable can be analyzed as either Boolean or continuous. The user can fine-tune the analysis by defining parameters such as the minimum probability of the rules, the minimum number of cases in each rule, and the cost of a miss vs. the cost of a false alarm. WizWhy reveals and lists the rules that relate the dependent variable to the other fields (conditions). The rules are formulated as if-then and if-and-only-if statements. On the basis of the discovered rules, WizWhy points out the main patterns, the unexpected rules (interesting phenomena), and the unexpected cases in the data. WizWhy can issue predictions for new cases on the basis of the discovered rules.
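    The kind of if-then rule WizWhy reports can be approximated by scanning each condition and estimating the conditional probability of the dependent value, subject to minimum-probability and minimum-cases thresholds. A hedged sketch in plain Python with invented records (not WizSoft's actual algorithm):

```python
# Naive rule discovery in the WizWhy spirit: for each (field, value)
# condition, estimate P(dependent = 1 | condition) and keep rules that
# pass minimum-probability and minimum-cases thresholds.
# The records, field names, and thresholds are all illustrative.
def find_rules(records, dependent, min_prob=0.8, min_cases=2):
    rules = []
    fields = [f for f in records[0] if f != dependent]
    for field in fields:
        for value in {r[field] for r in records}:
            matching = [r for r in records if r[field] == value]
            hits = sum(1 for r in matching if r[dependent] == 1)
            prob = hits / len(matching)
            if prob >= min_prob and len(matching) >= min_cases:
                rules.append((field, value, prob, len(matching)))
    return rules

records = [
    {"region": "north", "segment": "retail", "churn": 1},
    {"region": "north", "segment": "corp",   "churn": 1},
    {"region": "south", "segment": "retail", "churn": 0},
    {"region": "south", "segment": "corp",   "churn": 0},
    {"region": "north", "segment": "retail", "churn": 1},
]
rules = find_rules(records, "churn")
print(rules)   # [('region', 'north', 1.0, 3)]
```

    Real rule-induction systems also test conjunctions of conditions and adjust for the miss vs. false-alarm costs mentioned above.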
  • 32
    Universal Sentence Encoder
    The Universal Sentence Encoder (USE) encodes text into high-dimensional vectors that can be utilized for tasks such as text classification, semantic similarity, and clustering. It offers two model variants: one based on the Transformer architecture and another on Deep Averaging Network (DAN), allowing a balance between accuracy and computational efficiency. The Transformer-based model captures context-sensitive embeddings by processing the entire input sequence simultaneously, while the DAN-based model computes embeddings by averaging word embeddings, followed by a feedforward neural network. These embeddings facilitate efficient semantic similarity calculations and enhance performance on downstream tasks with minimal supervised training data. The USE is accessible via TensorFlow Hub, enabling seamless integration into various applications.
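    Semantic similarity between USE embeddings is typically measured with cosine similarity. A minimal sketch in plain Python, using toy 4-dimensional vectors as stand-ins for USE's high-dimensional output:

```python
import math

# Cosine similarity: dot product divided by the product of norms.
# The embedding vectors below are toy values, not real USE output.
def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

emb_a = [0.1, 0.3, 0.5, 0.1]
emb_b = [0.2, 0.25, 0.45, 0.05]
sim = cosine_similarity(emb_a, emb_b)
print(round(sim, 3))
```

    Values near 1.0 indicate semantically similar sentences; the same measure drives clustering and nearest-neighbor retrieval over the embeddings.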
  • 33
    GigaChat
    Sberbank
    GigaChat can answer user questions, maintain a dialogue, write program code, and create texts and pictures from descriptions, all within a single context. Unlike foreign neural networks, the GigaChat service natively supports multimodal interaction and communicates more competently in Russian. The architecture of the GigaChat service is based on the neural network ensemble of the NeONKA (NEural Omnimodal Network with Knowledge-Awareness) model, which includes various neural network models along with supervised fine-tuning and reinforcement learning from human feedback. Thanks to this, Sber's new neural network can solve many intellectual tasks: keep up a conversation, write texts, and answer factual questions. The inclusion of the Kandinsky 2.1 model in the ensemble gives the neural network the skill of creating images.
  • 34
    MatConvNet
    MatConvNet is a MATLAB toolbox implementing convolutional neural networks (CNNs) for computer vision applications. It is simple, efficient, and can run and learn state-of-the-art CNNs. Many pre-trained CNNs for image classification, segmentation, face recognition, and text detection are available. It is developed alongside the VLFeat open source library, which implements popular computer vision algorithms specializing in image understanding and local feature extraction and matching. VLFeat's algorithms include Fisher Vector, VLAD, SIFT, MSER, k-means, hierarchical k-means, agglomerative information bottleneck, SLIC superpixels, quick shift superpixels, large-scale SVM training, and many others. It is written in C for efficiency and compatibility, with interfaces in MATLAB for ease of use, and detailed documentation throughout. It supports Windows, Mac OS X, and Linux.
  • 35
    Neuri

    Neuri

    Neuri

We conduct and implement cutting-edge research on artificial intelligence to create real advantage in financial investment. Illuminating the financial market with ground-breaking neuro-prediction. We combine novel deep reinforcement learning algorithms and graph-based learning with artificial neural networks for modeling and predicting time series. Neuri strives to generate synthetic data emulating the global financial markets, testing it with complex simulations of trading behavior. We bet on the future of quantum optimization in enabling our simulations to surpass the limits of classical supercomputing. Financial markets are highly fluid, with dynamics evolving over time. As such, we build AI algorithms that adapt and learn continuously in order to uncover the connections between different financial assets, classes, and markets. The application of neuroscience-inspired models, quantum algorithms, and machine learning to systematic trading is, at this point, underexplored.
  • 36
    Supervisely

    Supervisely

    Supervisely

The leading platform for the entire computer vision lifecycle. Iterate from image annotation to accurate neural networks 10x faster. With our best-in-class data labeling tools, transform your images, videos, and 3D point clouds into high-quality training data. Train your models, track experiments, visualize and continuously improve model predictions, and build custom solutions within a single environment. Our self-hosted solution guarantees data privacy, powerful customization capabilities, and easy integration into your technology stack. A turnkey solution for computer vision: multi-format data annotation and management, quality control at scale, and neural network training in an end-to-end platform. Inspired by professional video editing software and created by data scientists for data scientists, it is the most powerful video labeling tool for machine learning and more.
  • 37
    FunnelTap

    FunnelTap

    SaaSMQL

FunnelTap is a platform that allows you to track and forecast marketing and sales funnels. You can create campaign scenarios and adjust variables to see how your funnel would be impacted. For example, if you just spent $20,000 on Google Ads and generated 45 leads and 5 new customers, what would happen if your cost-per-click (CPC) increased from $6 to $9? What if, instead of $20,000, you spent $75,000 next quarter? What if your deal size increased? With FunnelTap you can save your hypothetical scenarios and later compare them with the actual campaigns. You can't improve what you're not tracking. Use FunnelTap to map your funnel conversion rates and create forecasts based on different variables. Create different scenarios by modeling budget, conversion rates, cost-per-click, and customer value. Keep all your crucial metrics always in front of you. Play with your funnel variables to visualize worst- and best-case scenarios.
    Starting Price: $39 per month
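The scenario math behind that example can be sketched in a few lines: infer the click-to-lead and lead-to-customer rates from the baseline campaign, then hold them constant while varying budget or CPC. The function and field names below are hypothetical, not FunnelTap's API.

```python
def funnel_scenario(budget, cpc, click_to_lead, lead_to_customer):
    """Project a funnel from spend, CPC, and fixed conversion rates."""
    clicks = budget / cpc
    leads = clicks * click_to_lead
    customers = leads * lead_to_customer
    return {"clicks": clicks, "leads": leads, "customers": customers,
            "cost_per_customer": budget / customers if customers else float("inf")}

# Baseline from the example: $20,000 at $6 CPC produced 45 leads and 5 customers.
budget, cpc = 20_000, 6.0
clicks = budget / cpc                # ~3,333 clicks
click_to_lead = 45 / clicks          # implied click-to-lead rate
lead_to_customer = 5 / 45            # implied lead-to-customer rate

base = funnel_scenario(budget, cpc, click_to_lead, lead_to_customer)
pricier = funnel_scenario(budget, 9.0, click_to_lead, lead_to_customer)  # CPC rises to $9
bigger = funnel_scenario(75_000, 6.0, click_to_lead, lead_to_customer)   # spend rises to $75k

print(round(pricier["customers"], 1))  # fewer customers at the same spend
print(round(bigger["customers"], 1))   # more customers at the higher spend
```

The same structure extends to deal-size changes: multiply `customers` by an average deal value and compare revenue across scenarios.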
  • 38
    WineEngine
    WineEngine is powered by TinEye's unparalleled image recognition technology and has been engineered and optimized to work with photographs captured by users' smart devices. This service uses exceptional image recognition algorithms and neural networks to deal with the common problems encountered in user-supplied photographs: low resolution, bad lighting and color, improper framing and cropping, off-centre angles and blurriness. WineEngine has also been specially engineered to recognize wine vintages when available on a label. High success rate even with low-quality label images. Automatically locates and focuses on the label region within an image. Outperforms OCR-based attempts to read labels. Searches in real-time, even for multi-million wine label collections. WineEngine combines TinEye’s state-of-the-art image recognition algorithms with neural networks to provide fast and reliable recognition of wine, beer and spirit labels.
    Starting Price: $200/month
  • 39
    Torch

    Torch

    Torch

    Torch is a scientific computing framework with wide support for machine learning algorithms that puts GPUs first. It is easy to use and efficient, thanks to an easy and fast scripting language, LuaJIT, and an underlying C/CUDA implementation. The goal of Torch is to have maximum flexibility and speed in building your scientific algorithms while making the process extremely simple. Torch comes with a large ecosystem of community-driven packages in machine learning, computer vision, signal processing, parallel processing, image, video, audio and networking among others, and builds on top of the Lua community. At the heart of Torch are the popular neural network and optimization libraries which are simple to use, while having maximum flexibility in implementing complex neural network topologies. You can build arbitrary graphs of neural networks, and parallelize them over CPUs and GPUs in an efficient manner.
  • 40
    Zebra by Mipsology
Zebra by Mipsology is the ideal Deep Learning compute engine for neural network inference. Zebra seamlessly replaces or complements CPUs/GPUs, allowing any neural network to compute faster, with lower power consumption, at a lower cost. Zebra deploys swiftly, seamlessly, and painlessly without knowledge of the underlying hardware technology, use of specific compilation tools, or changes to the neural network, the training, the framework, or the application. Zebra computes neural networks at world-class speed, setting a new standard for performance. Zebra runs on everything from the highest-throughput boards to the smallest ones; this scaling provides the required throughput in data centers, at the edge, or in the cloud. Zebra accelerates any neural network, including user-defined neural networks. Zebra processes the same CPU/GPU-trained neural network with the same accuracy, without any change.
  • 41
    Nixtla

    Nixtla

    Nixtla

    Nixtla is a platform for time-series forecasting and anomaly detection built around its flagship model TimeGPT, described as the first generative AI foundation model for time-series data. It was trained on over 100 billion data points spanning domains such as retail, energy, finance, IoT, healthcare, weather, web traffic, and more, allowing it to make accurate zero-shot predictions across a wide variety of use cases. With just a few lines of code (e.g., via their Python SDK), users can supply historical data and immediately generate forecasts or detect anomalies, even for irregular or sparse time series, and without needing to build or train models from scratch. TimeGPT supports advanced features like handling exogenous variables (e.g., events, prices), forecasting multiple time-series at once, custom loss functions, cross-validation, prediction intervals, and model fine-tuning on bespoke datasets.
  • 42
    XLMiner

    XLMiner

    Frontline Systems

XLMiner® Platform is now named Analytic Solver® Data Mining. It's our easy-to-use, highest-capacity tool for data visualization, forecasting, and data mining in Excel. It enables you to explore, visualize, and transform your data in Excel, apply both classical statistics and modern data mining methods such as classification and regression trees and neural networks, and easily apply the most popular time series methods for forecasting. It can sample data from virtually any database, including Microsoft's Power Pivot in-memory database handling 100 million rows or more, clean and transform your data, and partition data into training, validation, and test datasets. Its performance and capacity rival those of "enterprise" data mining software costing ten times its price. Besides the latest enhancements to XLMiner Platform's features and performance, you get more with Analytic Solver Data Mining, including free access to our cloud version and free use of our optimization and simulation tools, among other extras.
    Starting Price: $2495 one-time payment
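The train/validation/test partitioning step mentioned above is simple to sketch: shuffle the rows, then slice by the chosen proportions. The 60/20/20 split and function name here are hypothetical defaults, not the product's.

```python
import random

def partition(rows, train=0.6, valid=0.2, seed=42):
    """Shuffle rows and split them into training, validation, and test sets."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)   # fixed seed for a reproducible split
    n = len(rows)
    n_train = int(n * train)
    n_valid = int(n * valid)
    return (rows[:n_train],
            rows[n_train:n_train + n_valid],
            rows[n_train + n_valid:])

data = list(range(100))
tr, va, te = partition(data)
print(len(tr), len(va), len(te))  # 60 20 20
```

Models are then fit on the training set, tuned against the validation set, and scored once on the held-out test set.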
  • 43
    Simulation Master

    Simulation Master

    Vortarus Technologies

Simulation Master runs 100% inside Excel, in an environment with which you are already comfortable. All reports and charts are native Excel objects, allowing you to edit them when communicating with colleagues. All simulation functions can be entered directly in the worksheet. If you're not sure how to enter them, Simulation Master has assistance tools that enter random variables, time series, decision variables, and copulas for you. If you have a decision tree created with DTace, instead of discrete outcomes, you can simulate chance node outcomes with Simulation Master. If you have a machine learning model created with Vaimal, you can simulate the model with Simulation Master. This is a powerful combination that allows data models to be used with analytic simulation models.
    Starting Price: $199 one-time payment
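The core loop of a worksheet-style Monte Carlo simulation like this is easy to illustrate: sample the input random variables many times, evaluate the model formula for each trial, and summarize the output distribution. The triangular-revenue/normal-cost profit model below is a made-up example, not anything shipped with Simulation Master.

```python
import random
import statistics

def one_trial(rng):
    """One simulation trial: sample inputs, evaluate the 'output cell'."""
    revenue = rng.triangular(90_000, 130_000, 110_000)  # low, high, mode
    cost = rng.normalvariate(70_000, 5_000)             # mean, std dev
    return revenue - cost                               # profit

rng = random.Random(7)  # seeded for reproducibility
profits = [one_trial(rng) for _ in range(10_000)]

print(round(statistics.mean(profits)))   # expected profit
print(round(statistics.stdev(profits)))  # spread of outcomes
```

From the same list of trial outputs you can also read off percentiles or the probability of a loss, which is the kind of summary a simulation report presents.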
  • 44
    QMSys GUM

    QMSys GUM

    Qualisyst

The QMSys GUM software is suitable for analyzing the uncertainty of physical measurements, chemical analyses, and calibrations. The software uses three different methods to calculate measurement uncertainty. The GUF method for linear models is applied to linear and quasi-linear models and corresponds to the GUM Uncertainty Framework: the software calculates the partial derivatives (the first term of a Taylor series) to determine the sensitivity coefficients of the equivalent linear model, then computes the combined standard uncertainty in accordance with the Gaussian error propagation law. The GUF method for nonlinear models is intended for nonlinear models with symmetric distributions of the result quantities; it employs a series of numerical methods, e.g. nonlinear sensitivity analysis, second- and third-order sensitivity indices, and quasi-Monte Carlo with Sobol sequences.
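The linear GUF step described above can be sketched numerically: estimate each sensitivity coefficient c_i = ∂f/∂x_i by a central finite difference, then combine the input standard uncertainties as u_c(y) = sqrt(Σ (c_i·u_i)²) for uncorrelated inputs. The measurement model below (a resistance from voltage and current) is a hypothetical example, not taken from the software.

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """Combined standard uncertainty of y = f(x) for uncorrelated inputs."""
    total = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        step = h * max(abs(xi), 1.0)
        hi = list(x); hi[i] = xi + step
        lo = list(x); lo[i] = xi - step
        ci = (f(hi) - f(lo)) / (2 * step)  # sensitivity coefficient, ~∂f/∂x_i
        total += (ci * ui) ** 2            # Gaussian error propagation term
    return math.sqrt(total)

# Example model: R = V / I with V = 10.0 V (u = 0.02 V), I = 2.0 A (u = 0.01 A)
resistance = lambda p: p[0] / p[1]
u_R = combined_uncertainty(resistance, [10.0, 2.0], [0.02, 0.01])
print(round(u_R, 4))  # combined standard uncertainty of R, in ohms
```

For this model the coefficients are ∂R/∂V = 1/I = 0.5 and ∂R/∂I = −V/I² = −2.5, so the finite-difference result matches the analytic propagation law.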
  • 45
    Salford Predictive Modeler (SPM)
    The Salford Predictive Modeler® (SPM) software suite is a highly accurate and ultra-fast platform for developing predictive, descriptive, and analytical models. The Salford Predictive Modeler® software suite includes the CART®, MARS®, TreeNet®, Random Forests® engines, as well as powerful new automation and modeling capabilities not found elsewhere. The SPM software suite’s data mining technologies span classification, regression, survival analysis, missing value analysis, data binning and clustering/segmentation. SPM algorithms are considered to be essential in sophisticated data science circles. The SPM software suite‘s automation accelerates the process of model building by conducting substantial portions of the model exploration and refinement process for the analyst. We package a complete set of results from alternative modeling strategies for easy review.
  • 46
    ApiScout

    ApiScout

    ApiScout

ApiScout is your one-stop environment for building, testing, and describing REST APIs. It is incredibly fast: no waiting or restarting, no spinners, and fast on all Mac devices. Compose requests and inspect responses; ApiScout is the only HTTP client tool you will need while building, testing, and describing your APIs. Organize requests into folders. Use dynamically calculated values, values from previous responses, environment variables, computed hashes, and more in every part of your requests. Use environments to group related sets of values together; this is very handy for switching user accounts, servers, or anything else. Define variables such as tokens, server base URLs, or credentials and re-use them globally for a seamless development/production workflow.
    Starting Price: $5 per month
  • 47
    Develve

    Develve

    Develve Statistical Software

Statistical software for fast and easy interpretation of experimental data in science and R&D in a technical environment. This statistical package helps with analysis and prevents false assumptions. In short, it makes statistics faster and easier, suitable for less-experienced users yet advanced enough for more demanding ones. Develve has no deeply hidden menus; everything is directly accessible and the results are immediately visible, which improves productivity. For instance, the result graphs are easily scrollable, and clicking on a graph pops up a bigger version. Develve clearly indicates when two variables are significantly different and whether the sample size is big enough to prevent false assumptions. For a design of experiments, Develve helps create the test matrix, and it will detect when a factor is not in balance. This program can help develop a robust, high-quality product, making Develve an excellent Six Sigma toolbox.
    Starting Price: $75 one-time payment
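The significance check described above can be roughed out with Welch's t statistic: compare the difference in sample means against its standard error. The data and the coarse |t| > 2 threshold below are illustrative only (a proper test would use the t distribution with Welch's degrees of freedom, as a tool like Develve does internally).

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical measurements from two production batches.
batch_a = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]
batch_b = [10.9, 11.2, 10.8, 11.0, 11.1, 10.7, 11.3, 10.9]

t = welch_t(batch_a, batch_b)
print(abs(t) > 2.0)  # True suggests a real difference rather than noise
```

A larger sample shrinks the standard error in the denominator, which is why sample size matters for avoiding false conclusions about a difference.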
  • 48
    TECHBASE Oil and Gas

    TECHBASE Oil and Gas

    TECHBASE International

The TECHBASE Oil and Gas Package plots downhole data and maps, and stores your data in a TECHBASE database. Composite and locate your data within boundaries, then choose among 7 modeling algorithms to interpret your surface and subsurface data. Composite (average) downhole data into intervals, rock types, benches, or user-defined variables. Choose from 7 modeling algorithms (inverse distance, kriging, 2D polygon estimator, 2D trend surface estimator, 2D triangulation estimator, 2D minimum curvature, 2D contour-to-grid interpolation) plus variography. Populate 3D block tables or 2D cell tables using TECHBASE modeling algorithms. Locate data within boundaries such as land ownership limits. Convert data from one coordinate system to another "on the fly" while creating TECHBASE graphics.
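The simplest of the seven listed estimators, inverse distance weighting, can be sketched directly: each known sample contributes to an estimated grid value in proportion to 1/dᵖ. The sample points and the power parameter below are hypothetical, not TECHBASE defaults.

```python
def idw(x, y, samples, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exactly on a sample point: honor the data
        w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den

# Hypothetical composited downhole grades at surveyed collar locations.
samples = [(0.0, 0.0, 1.0), (10.0, 0.0, 3.0), (0.0, 10.0, 5.0)]
print(idw(5.0, 0.0, samples))  # estimate pulled between the nearby 1.0 and 3.0 samples
```

Kriging refines this idea by deriving the weights from a variogram model of spatial correlation rather than from distance alone, which is why the package pairs its estimators with variography.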
  • 49
    Spherexx Optimize
    Spherexx Optimize is an AI-powered revenue management platform tailored for multifamily properties. It enhances operational efficiency by providing real-time occupancy forecasting, predictive analysis for leases and renewals, and automated pricing models based on various variables like competitor data and property type. The platform integrates seamlessly with top property management software, and includes powerful features like customizable dashboards, pricing simulations, and renovation valuation tools. With built-in reporting and competitor analysis, Optimize helps property managers make data-driven decisions to maximize asset value and drive profitability.
  • 50
    Fido

    Fido

    Fido

Fido is a lightweight, open source, and highly modular C++ machine learning library targeted at embedded electronics and robotics. Fido includes implementations of trainable neural networks, reinforcement learning methods, genetic algorithms, and a full-fledged robotic simulator. Fido also comes packaged with a human-trainable robot control system, as described in Truell and Gruenstein. While the simulator is not in the most recent release, it can be found for experimentation on the simulator branch.