File Coverage

blib/lib/Paws/MachineLearning/PerformanceMetrics.pm
Criterion      Covered   Total       %
statement            3       3   100.0
branch             n/a
condition          n/a
subroutine           1       1   100.0
pod                n/a
total                4       4   100.0


line stmt bran cond sub pod time code
1             package Paws::MachineLearning::PerformanceMetrics;
2 1     1   538 use Moose;
  1         2  
  1         8  
3             has Properties => (is => 'ro', isa => 'Paws::MachineLearning::PerformanceMetricsProperties');
4             1;
5              
6             ### main pod documentation begin ###
7              
8             =head1 NAME
9              
10             Paws::MachineLearning::PerformanceMetrics
11              
12             =head1 USAGE
13              
14             This class represents one of two things:
15              
16             =head3 Arguments in a call to a service
17              
18             Use the attributes of this class as arguments to methods. You shouldn't make instances of this class.
19             Each attribute should be used as a named argument in the calls that expect this type of object.
20              
21             As an example, if Att1 is expected to be a Paws::MachineLearning::PerformanceMetrics object:
22              
23             $service_obj->Method(Att1 => { Properties => $value });
24              
25             =head3 Results returned from an API call
26              
27             Use accessors for each attribute. If Att1 is expected to be a Paws::MachineLearning::PerformanceMetrics object:
28              
29             $result = $service_obj->Method(...);
30             $result->Att1->Properties
31              
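As a fuller sketch (not generated from this class), the object usually arrives nested inside an evaluation result. This assumes that C<GetEvaluation> on a L<Paws::MachineLearning> service object returns a result exposing a C<PerformanceMetrics> accessor; check your installed Paws for the exact call and result shape:

  use Paws;

  # Build a MachineLearning service object (region is an example value).
  my $ml = Paws->service('MachineLearning', region => 'us-east-1');

  # Assumed call: GetEvaluation returns an output object whose
  # PerformanceMetrics accessor yields a
  # Paws::MachineLearning::PerformanceMetrics instance.
  my $result  = $ml->GetEvaluation(EvaluationId => 'ev-example');
  my $metrics = $result->PerformanceMetrics;

  # Properties is the only attribute of this class; it holds a
  # Paws::MachineLearning::PerformanceMetricsProperties object.
  my $props = $metrics->Properties;
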
32             =head1 DESCRIPTION
33              
34             Measurements of how well the C<MLModel> performed on known
35             observations. One of the following metrics is returned, based on the
36             type of the C<MLModel>:
37              
38             =over
39              
40             =item *
41              
42             BinaryAUC: The binary C<MLModel> uses the Area Under the Curve (AUC)
43             technique to measure performance.
44              
45             =item *
46              
47             RegressionRMSE: The regression C<MLModel> uses the Root Mean Square
48             Error (RMSE) technique to measure performance. RMSE measures the
49             difference between predicted and actual values for a single variable.
50              
51             =item *
52              
53             MulticlassAvgFScore: The multiclass C<MLModel> uses the F1 score
54             technique to measure performance.
55              
56             =back
57              
58             For more information about performance metrics, please see the Amazon
59             Machine Learning Developer Guide.
60              
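As a hedged illustration of how these metric names might be consumed, the sketch below uses a plain-Perl hash as a hypothetical stand-in for the data carried by the C<Properties> attribute (the exact accessor for the underlying map depends on the generated C<Paws::MachineLearning::PerformanceMetricsProperties> class):

  # Hypothetical name/value pairs; only one of these keys is expected,
  # depending on the type of the MLModel.
  my %properties = ( BinaryAUC => '0.92' );   # example data, not real output

  if (exists $properties{BinaryAUC}) {
      printf "Binary model AUC: %s\n", $properties{BinaryAUC};
  }
  elsif (exists $properties{RegressionRMSE}) {
      printf "Regression model RMSE: %s\n", $properties{RegressionRMSE};
  }
  elsif (exists $properties{MulticlassAvgFScore}) {
      printf "Multiclass model average F1 score: %s\n", $properties{MulticlassAvgFScore};
  }
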
61             =head1 ATTRIBUTES
62              
63              
64             =head2 Properties => L<Paws::MachineLearning::PerformanceMetricsProperties>
65              
66             The performance metrics measured for the C<MLModel>, as described above.
67              
68              
69              
70             =head1 SEE ALSO
71              
72             This class forms part of L<Paws>, describing an object used in L<Paws::MachineLearning>.
73              
74             =head1 BUGS and CONTRIBUTIONS
75              
76             The source code is located here: https://github.com/pplu/aws-sdk-perl
77              
78             Please report bugs to: https://github.com/pplu/aws-sdk-perl/issues
79              
80             =cut
81