Fair Use Rules: http://www.spec.org/fairuse.html
************************************************
SPEC Fair Use Rules
Updated 9 February 2023
Introduction
Consistency and fairness are guiding principles for SPEC. To help assure that these principles are met, the following requirements must be met by any organization or individual who makes public use of SPEC benchmark results.
Section I lists general requirements that apply to public use of all SPEC benchmarks. Section II lists additional specific requirements for individual benchmarks.
This document is intended to provide the information needed for compliance with Fair Use; in the event of any inconsistency, this document takes precedence over the fair use requirements in individual benchmark run rules.
I. General Requirements For Public Use of All SPEC Benchmark Results
I.A. Requirements List
Compliance. Claimed results must be compliant with that benchmark's rules. See definition: compliant result. (Certain Exceptions may apply.)
Data Sources
Source(s) must be stated for quoted SPEC results.
Such sources must be publicly available, from SPEC or elsewhere.
The licensee (the entity responsible for the result) must be clearly identifiable from the source.
The date that the data was retrieved must be stated.
The SPEC web site (http://www.spec.org) or a suitable subpage must be noted as a resource for additional information about the benchmark.
Clear and correct, as of a specific date
Statements regarding SPEC, its benchmarks, and results published by SPEC, must be clear and correct.
A claim must state a date as of which data was retrieved.
A claim may compare newly announced compliant results vs. data retrieved earlier.
There is no requirement to update a claim when later results are published.
For example, an Acme web page dated 28 January 2011 announces performance results for the Model A and claims "the best SPECweb® 2009 benchmark performance when compared vs. results published at www.spec.org as of 26 January 2011". If SPEC publishes better results on 1 February, there is no requirement to update the page.
Trademarks.
Reference must be made to the SPEC trademark. Such reference may be included in a notes section with other trademark references (SPEC trademarks are listed at http://www.spec.org/spec/trademarks.html).
SPEC's trademarks may not be used to mislabel something that is not a SPEC metric.
For example, suppose that a Gaming Society compares performance using a composite of a weighted subset of the SPEC CPU 2006 benchmark plus a weighted subset of the SPECviewperf 11 benchmark, and calls its composite "GamePerfMark". The composite, weighting, and subsetting are done by the Society, not by SPEC. The composite may be useful and interesting, but it may not be represented as a SPEC metric. It would be a Fair Use violation to reference it as "SPECgame".
Required Metrics. In the tables below, some benchmarks have Required Metrics. Public statements must include these.
Comparisons. It is fair to compare compliant results to other compliant results. Enabling such comparisons is a core reason why the SPEC benchmarks exist. Each benchmark product has workloads, software tools, run rules, and review processes that are intended to improve the technical credibility and relevance of such comparisons.
When comparisons are made,
SPEC metrics may be compared only to SPEC metrics.
The basis for comparison must be stated.
Results of one benchmark may not be compared to results of a different benchmark
(e.g. SPECjAppServer2004 to TPC-C; or SPECvirt_sc2010 to SPECweb2005).
Results of a benchmark may not be compared to a different major release of the same benchmark
(e.g. SPECweb2005 to SPECweb2009). Exception: normalized historical comparisons may be made as described under Retired Benchmarks.
Comparisons of non-compliant numbers. The comparison of non-compliant numbers to compliant results is restricted to certain exceptional cases described later in this Fair Use Rule (Academic/Research usage; Estimates, for those benchmarks that allow estimates; Normalized Historical Comparisons). Where allowed, comparisons that include non-compliant numbers must not be misleading or deceptive as to compliance. It must be clear from the context of the comparison which numbers are compliant and which are not.
I.B. Generic Example
This example for a generic SPEC benchmark illustrates the points above. See also the examples for specific benchmarks below, for additional requirements that may apply.
Example: New York, NY, January 28, 2011: Acme Corporation announces that the Model A achieves 100 for SPECgeneric2011, a new record among systems running Linux [1].
[1] Comparison based on best performing systems using the Linux operating system published at www.spec.org as of 26 January 2011. SPEC® and the benchmark name SPECgeneric® are registered trademarks of the Standard Performance Evaluation Corporation. For more information about SPECgeneric2011, see www.spec.org/generic2011/.
I.C. Compliance Exceptions
Exceptions regarding the compliance requirement are described in this section.
Academic/research usage. SPEC encourages use of its benchmarks in research and academic contexts, on the grounds that SPEC benchmarks represent important characteristics of real world applications and therefore research innovations measured with SPEC benchmarks may benefit real users. SPEC understands that academic use of the SPEC benchmarks may be seen as enhancing the credibility of both the researcher and SPEC.
Research use of SPEC benchmarks may not be able to meet the compliance requirement.
Examples: (1) Testing is done with a simulator rather than real hardware. (2) The software innovation is not generally available or is not of product quality. (3) The SPEC test harness is modified without approval of SPEC.
SPEC has an interest in protecting the integrity of the SPEC metrics, including consistency of methods of measurement and the meaning of the units of measure that are defined by SPEC benchmarks. It would be unfair to those who do meet the compliance requirements if non-compliant numbers were misrepresented as compliant results. Therefore, SPEC recommends that researchers consider using the SPEC workload but not calling the measurements by the SPEC metric name.
The requirements for Fair Use in academic/research contexts are:
It is a Fair Use violation to imply, to the reasonable reader, that a non-compliant number is a compliant result.
Non-compliance must be clearly disclosed. If the SPEC metric name is used, it is recommended that (nc), for non-compliant, be added after each mention of the metric name. It is understood that there may be other ways to accomplish this in context, for example adding words such as "experimental" or "simulated" or "estimated" or "non-compliant".
Diagrams, Tables, and Abstracts (which, often, are excerpted and used separately) must have sufficient context on their own so that they are not misleading as to compliance.
If non-compliant numbers are compared to compliant results it must be clear from the context which is which.
Example: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. Our Research Compiler improves the same hardware to SPECint®2006 125(nc). The notation (nc), for non-compliant, is used because our compiler does not meet SPEC's requirements for general availability.
Other Fair Use Requirements Still Apply. This section discusses an exception to only the compliance requirement from the Requirements List. Fair Use in academic/research context must still meet the other requirements, including but not limited to making correct use of SPEC results with dated citations of sources.
Estimates. Some SPEC benchmarks allow estimates, as shown in the tables below. Only for those benchmarks, it is acceptable to compare estimates to compliant results provided that:
Estimates must be clearly identified as such.
Each use of a SPEC metric as an estimate must be clearly marked as an estimate.
If estimates are used in graphs, the word "estimated" or "est." must be plainly visible within the graph, for example in the title, the scale, the legend, or next to each individual number that is estimated.
Licensees are encouraged to give a rationale or methodology for any estimates, together with other information that may help the reader assess the accuracy of the estimate.
Example 1: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. The Bugle Corporation Model B will nearly double that performance to SPECint®2006 198(est). The notation (est), for estimated, is used because SPECint®2006 was run on pre-production hardware. Customer systems, planned for Q4, are expected to be similar.
Example 2: Performance estimates are modeled using the cycle simulator GrokSim Mark IV. It is likely that actual hardware, if built, would include significant differences.
I.D. Derived Values
It is sometimes useful to define a numeric unit that includes a SPEC metric plus other information, and then use the new number to compare systems. This is called a Derived Value.
Examples: SPECint®_rate2006 per chip
SPECvirt_sc2010 per gigabyte
Note: the examples above are not intended to imply that all derived values use ratios of the form above. The definition is intentionally broad, and includes additional examples.
Derived values are acceptable, provided that they follow this Fair Use rule, including but not limited to using compliant results, listing sources for SPEC result data, and including any required metrics.
A derived value must not be represented as a SPEC metric. The context must not give the appearance that SPEC has created or endorsed the derived value. In particular, it is a Fair Use violation, and may be a Trademark violation, to form a new word that looks like a SPEC metric name when there is no such metric.
Not Acceptable: SPECint®_chiprate2006
SPECvirt_sc2010gigs
If a derived value is used as the basis of an estimate, the estimate must be correctly labeled. A derived value may introduce seeming opportunities to extrapolate beyond measured data. For example, if 4 different systems all have the same ratio of SPECwhatever per chip, it can be tempting to estimate that another, unmeasured, system will have the same ratio. This may be a very good estimate; but it is still an estimate, and must be correctly labeled. If used in public, it must be for a benchmark that allows estimates.
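The distinction above between a derived value and an estimate can be sketched in Python. All system names and numbers below are hypothetical illustrations, not real SPEC results:

```python
# Sketch of a "derived value": a SPEC metric combined with other
# disclosed information. Hypothetical systems and numbers only.

def derived_value(spec_metric: float, other_quantity: float) -> float:
    """Divide a SPEC metric by a non-metric quantity (e.g. chip count)."""
    return spec_metric / other_quantity

# Hypothetical compliant results and chip counts from disclosure pages.
systems = {
    "Model A": {"SPECrate_int_base": 400.0, "chips": 2},
    "Model B": {"SPECrate_int_base": 780.0, "chips": 4},
}

for name, data in systems.items():
    per_chip = derived_value(data["SPECrate_int_base"], data["chips"])
    # A derived value must not be presented as a SPEC metric itself.
    print(f"{name}: {per_chip:.1f} per chip (derived value, not a SPEC metric)")

# Extrapolating a per-chip ratio to an unmeasured 8-chip system is an
# estimate and must be labelled as such:
estimate_8_chip = 195.0 * 8  # Model B's per-chip ratio, scaled up
print(f"Unmeasured 8-chip system: {estimate_8_chip:.1f} (est)")
```

The labels in the output mirror the rule: the measured ratios are derived values, while the scaled-up number is an estimate and carries the "(est)" marker.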
I.E. Non-SPEC Information
A basis of comparison or a derived value may use information from both SPEC and non-SPEC sources.
SPEC values truthfulness and clarity at all times:
When information from SPEC sources is used in public, SPEC requires that such information be reported correctly (per section I.A.3).
SPEC recommends that non-SPEC material be accurate, relevant, and not misleading. Data and methods should be explained and substantiated.
Disclaimer. SPEC is not responsible for non-SPEC information. The SPEC Fair Use rule is limited to the information derived from SPEC sources. (Other rules may apply to the non-SPEC information, such as industry business standards, ethics, or Truth in Advertising law.)
SPEC may point out non-SPEC content. SPEC reserves the right to publicly comment to distinguish SPEC information from non-SPEC information.
Integrity of results and trademarks. The non-SPEC information must not be presented in a manner that may reasonably lead the reader to untrue conclusions about SPEC, its results, or its trademarks.
Examples
Example 1 (basis): ACME Corporation claims the best SPECjEnterprise 2010 benchmark performance for systems available as (example 1a) rack mount, or (1b) with more than 8 disk device slots, or (1c) with Art Deco paint. Bugle Corporation asserts that the basis of comparison is irrelevant or confusing or silly. Bugle may be correct. Nevertheless, such irrelevance, confusion, or silliness would not alone be enough to constitute a SPEC Fair Use violation.
Example 2 (derived value): ACME claims that its model A has better SPECint®_rate2006 per unit of cooling requirement than does the Bugle Model B. SPEC is not responsible for judging thermal characteristics.
Example 3: ACME claims the "best SPECmpi®M_2007 performance among industry-leading servers". This claim violates the requirement that the basis must be clear.
Example 4: ACME computes SPECint®_rate2006 per unit of cooling, but inexplicably selects SPECint®_rate_base2006 for some systems and SPECint®_rate2006 for others. The computation violates the requirement that the SPEC information must be accurate, and may also violate the requirement that a claim should not lead the reasonable reader to untrue conclusions about SPEC's results.
I.F. Retired Benchmarks
Disclosure. If public claims are made using a retired benchmark, with compliant results that have not been previously reviewed and accepted by SPEC, then the fact that the benchmark has been retired and new results are no longer being accepted for review and publication by SPEC must be plainly disclosed.
Example: The Acme Corporation Model A achieves a score of 527 SPECjvm98. Note: SPECjvm98 has been retired and SPEC is no longer reviewing or publishing results with that benchmark. We are providing this result as a comparison to older hardware that may still be in use at some customer sites.
Benchmarks that require review. Some benchmarks require that SPEC review and accept results prior to public use. For such benchmarks, the review process is not available after benchmark retirement, and therefore no new results may be published.
Normalized historical comparisons. When SPEC releases a new major version of a benchmark, the SPEC metrics are generally not comparable to the previous version, and there is no formula for converting from one to the other. Nevertheless, SPEC recognizes that there is value in historical comparisons, which are typically done by normalizing performance across current and one or more generations of retired benchmarks, using systems that have been measured with both the older and newer benchmarks as the bridges for the normalization. Historical comparisons are inherently approximate because picking differing 'bridge' systems may yield differing ratios and because an older workload exercises different system capabilities than a more modern workload.
Normalized historical comparisons are acceptable only if their inherently approximate nature is not misrepresented. At minimum:
It must not be claimed that SPEC metrics for one benchmark generation are precisely comparable to metrics from another generation.
The approximate nature must be apparent from the context.
For example, a graph shown briefly in a presentation is labelled "Normalized Historic Trends for SPEC<benchmark>". As another example, in a white paper (where the expectation is for greater detail than presentations), the author explicitly calls out that workloads have differed over time, and explains how numbers are calculated.
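The bridge-system normalization described above can be sketched as follows. All scores are invented for illustration; a real comparison would disclose its bridge systems and its approximate nature:

```python
# Sketch of a normalized historical comparison using a "bridge" system
# measured on both an older (retired) and a newer benchmark generation.
# All values are invented; different bridge systems may yield different
# ratios, which is why the result is inherently approximate.

bridge_old = 50.0   # bridge system's score on the retired benchmark
bridge_new = 10.0   # same system's score on the current benchmark
scale = bridge_new / bridge_old  # approximate conversion factor

historical_scores_old = {"1998 system": 12.0, "2003 system": 35.0}

for name, old_score in historical_scores_old.items():
    normalized = old_score * scale
    # Must be presented as approximate, never as a precise metric value.
    print(f"{name}: ~{normalized:.1f} (normalized estimate)")
```

The "~" and "(normalized estimate)" markers are one way to keep the approximate nature apparent from context, as the rule requires.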
================================================================================================
II. Requirements for Public Use of Individual Benchmark Results
For further detail about the meaning of SPEC metrics, the individual benchmark run rules may be consulted. The benchmark names at the top of each table are links to that benchmark's run rules.
SPEC CPU® 2017 benchmark

SPEC.org Submission Requirements: None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently.

SPEC Metrics:

The geometric means for overall performance:

  Performance Geometric Means:
    SPECspeed®2017_int_base        SPECrate®2017_int_base
    SPECspeed®2017_int_peak        SPECrate®2017_int_peak
    SPECspeed®2017_fp_base         SPECrate®2017_fp_base
    SPECspeed®2017_fp_peak         SPECrate®2017_fp_peak

  Energy Geometric Means:
    SPECspeed®2017_int_energy_base   SPECrate®2017_int_energy_base
    SPECspeed®2017_int_energy_peak   SPECrate®2017_int_energy_peak
    SPECspeed®2017_fp_energy_base    SPECrate®2017_fp_energy_base
    SPECspeed®2017_fp_energy_peak    SPECrate®2017_fp_energy_peak

Additional SPEC metrics:
  Individual benchmark SPECratios
  Individual benchmark Energy Ratios
  Individual benchmark run times in seconds
  Individual benchmark energy consumption in kilojoules
  Overall system maximum power consumption in watts
  Overall system idle power consumption in watts

Required Metrics: The baseline performance metric for whichever suite is reported.

Conditionally Required Metrics:
  Condition: Comparison of idle power.
  Requirement: The overall maximum power, baseline Performance Geometric Mean, and baseline Energy Geometric Mean shall be disclosed in close proximity to the idle power.

Use of Estimates:
  Estimates are not allowed for any of the SPEC CPU 2017 energy metrics (neither the Energy Geometric Means nor the individual benchmark energy ratios). All public use of SPEC CPU 2017 energy metrics must be from rule-compliant results.
  SPEC CPU 2017 performance metrics (Performance Geometric Means, individual benchmark SPECratios, individual benchmark run times in seconds) may be estimated, provided that they are clearly identified as estimates.
  It is permitted to estimate a peak performance metric without providing a corresponding base performance estimate.

Disallowed Comparisons: Energy metrics generated with releases prior to SPEC CPU 2017 v1.1 are not comparable.
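As context for the overall metrics listed above: each is a geometric mean of per-benchmark ratios. A minimal sketch of the computation, with invented ratios rather than real results:

```python
import math

# The overall SPEC CPU metrics are geometric means of the individual
# benchmark SPECratios. The ratios below are invented for illustration.
specratios = [4.0, 9.0, 6.0]

# Geometric mean: nth root of the product of n ratios.
geo_mean = math.prod(specratios) ** (1.0 / len(specratios))
print(f"overall metric (geometric mean): {geo_mean:.2f}")
```

A geometric mean (rather than an arithmetic one) keeps a single outlier benchmark from dominating the overall score.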
================================================================================================
III. Definitions
Basis for Comparison
Information from a compliant result may be used to define a basis for comparing a subset of systems, including but not limited to memory size, number of CPU chips, operating system version, other software versions, or optimizations used. Other information, not derived from SPEC, may also be used to define a basis, for example, cost, size, cooling requirements, or other system characteristics. The basis must be clearly disclosed.
By Location
For benchmarks designated as having a submission requirement "By location", these requirements apply:
Each licensee test location (city, state/province and country) must measure and submit a single compliant result for review, and have that result accepted by the technically relevant subcommittee, before publicly disclosing or representing as compliant any result for the benchmark.
After acceptance of a compliant result from a test location, the licensee may publicly disclose future compliant results produced at that location without prior acceptance by the subcommittee.
The intent of this requirement is that the licensee test location demonstrates the ability to produce a compliant result.
Note that acceptance of a result for one SPEC benchmark does not relieve a licensee of the requirement to complete the procedure for any other SPEC benchmark(s) that also require initial acceptance by location.
Close Proximity
In the same paragraph or an adjacent paragraph for written materials; or visible simultaneously for visual materials. The font must be legible to the intended audience.
Compliant Result
(i) The set of measurements, logs, full disclosure report pages, and other artifacts that are the output of a process that follows the run and reporting rules of a SPEC benchmark. Depending on the benchmark and its rules, the process may have many steps and many ingredients, such as specific software, hardware, tuning, documentation, availability of support, and timeliness of shipment. To find the rules for a specific benchmark, click its name in the tables above.
(ii) A number within such set that is labelled as a SPEC metric.
Note that benchmark reporting pages include other types of information, such as the amount of memory on the system. It is not allowed to represent such other information as a SPEC metric, although it may be used to define a Basis for Comparison.
SPEC reviews results prior to publication on its web site, but the accuracy and compliance of the submission remains the responsibility of the benchmark licensee. See the disclaimer.
Derived Value
A unit that is a numerical function of one or more SPEC Metrics, rather than the original metric. The function may be a constant divisor, to normalize performance to a comparison system of interest. The function may bring in quantities that are some other characteristic(s) of the system. Such other characteristics may include information from both SPEC result pages and from non-SPEC sources.
Examples: "SPECint®_rate2006 per chip" (metric is divided by number of chips reported on SPEC disclosure)
"Cubic feet per SPECint®_rate2006" (a non-SPEC quantity is divided by the metric)
"Normalized SPECsfs2008_cifs" (metric is divided by result for a comparison system)
"GamePerfMark", from the trademark section above.
This definition is intentionally broad, encompassing any function that includes a SPEC metric as one of the inputs.
Disallowed Comparisons
As mentioned above, results of one benchmark may not be compared to a different benchmark, nor to a different major release of the same benchmark. Individual benchmarks may forbid other comparisons, typically where such comparisons are considered inherently misleading.
Estimate
An estimate is an alleged value for a SPEC metric that was not produced by a run rule compliant test measurement.
For purposes of this definition, it does not matter whether the alleged value for the metric was produced by extrapolating from known systems, or by cycle accurate simulation, or by whiteboard or dartboard, or by normal testing with the exception of a single missing mandatory requirement (e.g. the 3 month availability window). If the alleged value is not from a rule-compliant run, then it is an estimate.
The usage of estimates is limited.
Major Release
For purposes of this fair use rule, the term "major release" refers to a change in the year component of a benchmark product name, for example SPECjvm98 vs. SPECjvm2008.
Non-Compliant Number
A value for a SPEC metric that fails to meet all the conditions for a compliant result.
Usage Note: By the definition of Estimate, above, a non-compliant number is also an estimate; and, of course, an estimate does not comply with the run rules. Therefore, the terms are sometimes interchangeable. In practical usage, an estimate may bear no relationship to any measurement activity; whereas a non-compliant number is typically the product of running the SPEC-supplied tools in a manner that does not comply with the run rules. In such cases, the tools may print out numbers that are labelled with SPEC metric units, but the values that are printed are not compliant results. Such values are sometimes informally called "non-compliant results", but for the sake of clarity, this document prefers the term "non-compliant number".
Required Metric
A SPEC metric whose value must be supplied. Individual benchmark sections above list whether they have required metrics. If so, then when any data is used from a full disclosure report, the values for this/these metric(s) must also be used.
SPEC Metric
(i) A unit of measurement defined by a benchmark, such as response time or throughput for a defined set of operations. The available units for each benchmark are named in the tables above, and are defined within the benchmark run rules (which can be found by clicking the benchmark name in the tables above).
Example: SPECjvm2008 Peak ops/m.
(ii) A specific value measured by such a unit.
Example: 320.52 SPECjvm2008 Peak ops/m.
Usage Note: Both senses are used in this document, and it is expected that the sense is clear from context. For example, the prohibition against calling a derived value by a SPEC metric name is sense (i): do not define your own unit of measurement and then apply SPEC's trademarks to that unit. As another example, the rules for SPECpower_ssj2008 require disclosure of SPECpower_ssj2008 overall ssj_ops/watt, which is sense (ii): one is required to supply the value measured for a particular system.
A printed SPEC metric value is not necessarily a Compliant Result: SPEC provides tools that display values for SPEC metrics, such as the above example of "320.52 SPECjvm2008 Peak ops/m". Although SPEC's tools help to enforce benchmark run rules, they do not and cannot automatically enforce all rules. Prior to public use, the licensee remains responsible to ensure that all requirements for a compliant result are met. If the requirements are not met, then any printed values for the metrics are non-compliant numbers.
================================================================================================
IV. Violations Determination, Penalties, and Remedies
SPEC has a process for determining fair use violations and appropriate penalties and remedies that may be assessed.
================================================================================================
TradeMark: http://www.spec.org/spec/trademarks.html
****************************************************
SPEC® Trademarks And Service Marks
The following list of trademarks and service marks shows the proper designation for each mark of SPEC, the Standard Performance Evaluation Corporation. This list is applicable worldwide; however, in certain countries variations on designations have been used. Please contact SPEC if you have questions.
ADASMark ™ MLMARK ® SPEC Cloud ® SPECjvm ® SPEC SIP_Infrastructure ®
AndEBench ™ MultiBench ™ SPEC CPU ® SPECmail ® SPECspeed ®
AudioMark ™ OABench ™ SPECENV ™ SPEC Mark ™ SPECstorage ®
AutoBench ™ OMPM ™ SPECfp ® SPEC MPI ® SPECviewperf ®
BROWSINGBENCH ® PTDaemon ® SPECglperf ™ SPEC OMP ® SPEC VIRT ®
Chauffeur ® SECUREMARK ® SPECgpc ® SPEC OMPL ™ SPEC VIRT_SC ®
COREMARK ® SERT ® SPEChpc ™ SPECpower ® SPECweb ®
DENBench ™ SPEC ® SPECint ® SPECpower_ssj ® SPECworkstation ®
EEMBC ® SPEC ACCEL ® SPECjAppServer ® SPECrate ® SPECwpc ®
ENERGYRUNNER ® SPECapc ® SPECjbb ® SPEC SDM ™ TeleBench ™
FPMark ™ SPECchem ™ SPECjEnterprise ® SPECseis ™ ULPMARK ®
IOTMARK ® SPECclimate ® SPEC JMS ® SPEC SFS ®
SPEC logo
[ ® = Registered Trademark, ™ = Trademark, SM = Service Mark ]
Use of SPEC Trademarks and Service Marks
A trademark or service mark is a word or symbol used in commerce on goods or services to identify the source of those goods or services.
All SPEC marks must be properly designated with the appropriate symbol (®, ™ or SM). SPEC's marks may not be used by others except in a factual and non-trademark manner in accordance with the following policy:
Persons who have used or are using SPEC's goods and services to obtain objective performance evaluations for their computer products may express that fact and may use the related SPEC marks in their expressions of that fact. SPEC's marks may also be used in media descriptions or reviews of SPEC's goods and services.
Example usages:
The SPEC Cloud® IaaS 2016 benchmark is SPEC's first benchmark suite to measure cloud performance.
The SPECapc® for Maya® 2017 benchmark is all-new performance evaluation software for systems running Autodesk Maya 2017 3D animation software.
The Turboblaster 9000 was tested with the SPEC CPU® 2017 benchmark suite and produced a SPECint® 2017 int_base result of 9.30.
The SPEC ACCEL® benchmark suite tests performance with computationally intensive parallel applications running under the OpenCL, OpenACC, and OpenMP 4 target offloading APIs.
The term "Persons" in the policy includes both individuals and organizations, including, but not limited to, all commercial entities and corporations. The policy also applies to members of the press and any third party who desires to report, in a factual manner, the use of SPEC's products and services by another.
In practical terms, compliance with the policy consists of the use of the SPEC marks within the following guidelines:
How to use Trademark Symbols
All SPEC Marks must have a space between the SPEC mark's registration status symbol and year, e.g., The SPEC virt_sc® 2013 benchmark suites are used to measure performance of virtualized platforms.
In letters, memos, press releases, white papers, advertising, slides, foils, video, and other multimedia presentations:
Properly designate (with ®, ™ or SM) all of SPEC's marks at the most prominent use (usually a headline) and again on the first occurrence in copy
In the case of presentation graphics, marks should be designated with the proper mark registration status symbol on each page and slide
In newsletters, magazines, and publications containing multiple articles:
Properly designate (with ®, ™ or SM) all of SPEC's marks on the first occurrence in the Table of Contents, in headlines and on the first occurrence in every article in which they are used
In brochures, annual/quarterly reports, books, technical documentation, and other bound documents:
Properly designate SPEC's marks on the first occurrence in headlines and on the first occurrence in text with ®, ™ or SM.
In all charts or graphs, properly designate SPEC's marks with ®, ™ or SM, as they could be copied or pulled and used independently.
Trademark Acknowledgments
Properly footnote and acknowledge trademark and service mark ownership, preferably identifying SPEC marks as being owned by Standard Performance Evaluation Corporation (SPEC), e.g., SPEC CPU® is a registered trademark of SPEC.
SPEC's trademarks and service marks are SPEC's Intellectual property, and only SPEC may use its trademarks and service marks on goods and services.
Persons who desire to use SPEC's marks in any other manner must contact SPEC to inquire about written permission.
All SPEC marks must be properly designated with the appropriate symbol (®, ™ or SM).
Persons may not incorporate SPEC's marks into their own product names, service names, company names, logos, or in any other manner.
Persons may not adopt trademarks, service marks, product names, company names, or logos that are substantially or confusingly similar to SPEC's marks.
Persons may not license the use of SPEC's marks to others.
================================================================================================
Copyright: http://www.spec.org/spec/copyright.html
***************************************************
SPEC Copyright Notice
The Standard Performance Evaluation Corporation (SPEC) is the author and owner of all material on this server and reserves its rights.
Copyright © 1995 - 2024 Standard Performance Evaluation Corporation (SPEC). All rights reserved.
Permission to copy without fee all or part of this material is granted provided that (a) copies are not made or distributed for direct commercial advantage through distribution for profit of materials that are substantially all derived from SPEC materials, (b) the SPEC copyright notice appears, and (c) notice is given that copying is by permission of the Standard Performance Evaluation Corporation (SPEC).
SPEC and the "performance chart" SPEC logo are registered trademarks of the Standard Performance Evaluation Corporation.