[Federal Register Volume 74, Number 199 (Friday, October 16, 2009)]
[Notices]
[Pages 53243-53247]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: E9-24992]


=======================================================================
-----------------------------------------------------------------------

FEDERAL TRADE COMMISSION


Agency Information Collection Activities; Submission for OMB 
Review and Reinstatement of Existing Collection; Comment Request

AGENCY: Federal Trade Commission (``FTC'' or ``Commission'').

ACTION: Notice and request for comment.

-----------------------------------------------------------------------

SUMMARY: The FTC plans to conduct a national study of the accuracy of 
consumer reports in connection with Section 319 of the Fair and 
Accurate Credit Transactions Act of 2003, Pub. L.108-159 (2003). This 
study is a follow-up to the Commission's two previous pilot studies 
that were undertaken to evaluate a potential design for a national 
study.\1\  This is the second of two notices required under the 
Paperwork Reduction Act (``PRA''), and the Commission seeks additional 
public comments on its proposed national study before requesting Office 
of Management and Budget (``OMB'') review of, and clearance for, the 
collection of information discussed herein.
---------------------------------------------------------------------------

    \1\ Reports to Congress Under Sections 318 and 319 of the Fair 
and Accurate Credit Transactions Act of 2003, Federal Trade 
Commission, December 2006 and 2008. The reports may be accessed at 
the FTC's Web site. December 2006 Report: (http://www.ftc.gov/reports/FACTACT/FACT_Act_Report_2006.pdf); December 2008 Report: 
(http://www.ftc.gov/opa/2008/12/factareport.shtm).

---------------------------------------------------------------------------
DATES: Comments must be received on or before November 16, 2009.

ADDRESSES: Interested parties are invited to submit written comments 
electronically or in paper form, by following the instructions in the 
Request for Comments to 30-Day Notice part of the SUPPLEMENTARY 
INFORMATION section below. Comments in electronic form should be 
submitted by using the following Web link: (https://secure.commentworks.com/ftc/FACTA319studypra2) (and following the 
instructions on the web-based form). Comments in paper form should be 
mailed or delivered to the following address: Federal Trade Commission, 
Office of the Secretary, Room H-135 (Annex J), 600 Pennsylvania Avenue, 
NW, Washington, DC 20580, in the manner detailed in the SUPPLEMENTARY 
INFORMATION section below.

FOR FURTHER INFORMATION CONTACT: Peter Vander Nat, Economist, (202) 
326-3518, Federal Trade Commission, Bureau of Economics.

SUPPLEMENTARY INFORMATION: Under the PRA, 44 U.S.C. 3501-3520, federal 
agencies must obtain approval from OMB for each collection of 
information they conduct or sponsor. ``Collection of information'' 
means agency requests or requirements that members of the public submit 
reports, keep records, or provide information to a third party. 44 
U.S.C. 3502(3); 5 CFR 1320.3(c).
    On July 20, 2009, the FTC sought comment on the information 
collection requirements associated with the proposed national study.\2\ 
 As discussed below under Section II.D (Summary of and Response to Public 
Comments to Prior 60-Day Notice), three comments were received (see (http://www.ftc.gov/os/comments/facta319study/index.shtm) for text of the
comments). Pursuant to the OMB regulations, 5 CFR Part 1320, that 
implement the PRA, the FTC is providing this second opportunity for 
public comment while seeking OMB approval to reinstate the clearance 
for the proposed national study, which is a follow-up to the FTC's two 
prior pilot studies (OMB Control No. 3084-0133) that were undertaken to 
evaluate a potential design for a national study. All comments should 
be filed as prescribed in the ADDRESSES section above and in the 
Request for Comments to 30-Day Notice (found below at II.E.), and must 
be received on or before November 16, 2009.
---------------------------------------------------------------------------

    \2\ 74 FR 35191.
---------------------------------------------------------------------------

I. Background

    Section 319 of the Fair and Accurate Credit Transactions Act of 
2003 (``FACT Act'' or the ``Act''), Pub. L.108-159 (2003) requires the 
FTC to study the accuracy and completeness of information in consumers' 
credit reports and to consider methods for improving the accuracy and 
completeness of such information. Section 319 of the Act also requires 
the Commission to issue a series of biennial reports to Congress over a 
period of eleven years. The first report was submitted to Congress in 
December 2004.\3\  The second report was submitted to Congress in 
December 2006 (``December 2006 Report''), describing the results of a 
pilot study. The third report was submitted in December 2008 
(``December 2008 Report''), describing the results of a second pilot 
study.
---------------------------------------------------------------------------

    \3\ Report to Congress Under Sections 318 and 319 of the Fair 
and Accurate Credit Transactions Act of 2003, Federal Trade 
Commission, December 2004. The December 2004 Report is available at 
(http://www.ftc.gov/reports/index.htm#2004).
---------------------------------------------------------------------------

    In July 2005, OMB approved the FTC's request to conduct a pilot 
study to evaluate the feasibility of a methodology that involves direct 
review by consumers of the information in their credit reports (OMB 
Control Number 3084-0133),\4\  and the FTC conducted that pilot study 
in 2005-2006. As explained in the December 2006 report, FTC staff 
concluded that it was necessary to conduct a second pilot study to 
evaluate additional design elements prior to carrying out a nationwide 
survey. Upon receiving further OMB approval (reinstatement of Control 
No. 3084-0133), the FTC conducted the second pilot study in 2007-2008. 
The FTC's pilot studies used small samples and did not rely on the 
selection of a nationally representative sample of credit reports; 
accordingly, no statistical projections were made. The FTC now plans to 
conduct a national study of the accuracy of consumer reports in 
connection with Section 319 of the Fair and Accurate Credit 
Transactions Act of 2003, Pub. L.108-159 (2003). This study is a 
follow-up to the Commission's two previous pilot studies.
---------------------------------------------------------------------------

    \4\ See 70 FR 24583 (May 10, 2005) for discussion of the initial 
pilot study and related public comments.
---------------------------------------------------------------------------

A. Initial Pilot Study (2005-2006)

    The goal of the initial pilot study was to assess the feasibility 
of directly engaging consumers in an in-depth review of their credit 
reports for the purpose of identifying alleged material errors and 
channeling such errors through the Fair Credit Reporting Act

[[Page 53244]]

(``FCRA'') dispute resolution process. The FTC's contractor for the 
initial pilot study - a research team comprised of members from the 
Center for Business and Industrial Studies (University of Missouri-St.
Louis), Georgetown University Credit Research Center, and the Fair 
Isaac Corporation - engaged 30 randomly selected participants in an in-
depth review of their credit reports. Study participants obtained their 
credit reports and credit scores\5\  from each of the three nationwide 
consumer reporting agencies (Equifax, Experian, TransUnion - 
hereinafter, the ``CRAs''). The contractor reviewed these credit 
reports with the participants and after an evaluation of alleged errors 
for materiality by the research team, consumers were asked to channel 
disputed information through the FCRA dispute resolution process.\6\ 
---------------------------------------------------------------------------

    \5\ A credit score is a numerical summary of the information in 
a credit report and is designed to be predictive of the risk of 
default. Credit scores are created by proprietary formulas that 
render the following result: the higher the credit score, the lower 
the risk of default. The contractor in the first and second pilot 
studies employed (and the proposed national study expects to employ) 
a score that is commonly used in credit reporting, namely a FICO 
score.
    \6\ The FCRA dispute resolution process involves the review of 
disputed items by data furnishers and CRAs. The formal dispute 
process renders a specific outcome for each alleged error. By direct 
instruction of the data furnisher, the following outcomes may occur: 
delete the item, change or modify the item (specifying the change), 
or maintain the item as originally reported. A CRA may also delete a 
disputed item due to expiration of the statutory time frame (the 
FCRA limits the process to 30 days, but the time may be extended to 
45 days if a consumer submits relevant information during the 30-day 
period). These possible actions are tracked by a form called 
``Online Solution for Complete and Accurate Reporting'' (e-OSCAR) 
that is used by CRAs for resolving FCRA disputes. A consumer may 
also dispute information directly with a data furnisher, as provided 
for by FCRA 623(a)(8), 15 U.S.C. 1681s-2(a)(8). See also Federal
Trade Commission and Board of Governors of the Federal Reserve 
System, Report to Congress on the Fair Credit Reporting Act Dispute 
Process, August 2006. The report is available at (http://www.ftc.gov/reports/index.htm#2006).
---------------------------------------------------------------------------

    The first pilot study demonstrated the general feasibility of the 
consumer interview methodology, but also revealed several challenges 
for a national study.\7\  Challenges include identifying methods for 
achieving a more representative sampling frame, increasing the response 
rates, and easing the burden of completing the study. Compared to the 
national average for credit scores, consumers with relatively low 
scores were under-represented. Also, the majority of participants who 
alleged errors on their credit reports and indicated that they would 
file a dispute did not follow through with their stated intention to 
file. In consideration of these and other matters, the FTC conducted a 
follow-up pilot study.
---------------------------------------------------------------------------

    \7\ The FTC's December 2006 Report to Congress contains a more 
detailed review of the study and its results.
---------------------------------------------------------------------------

B. The Second Pilot Study (2007-2008)

    The second pilot study combined successful elements from the first 
pilot with new procedures designed to overcome shortcomings of the 
first pilot.
    Through a variety of recruitment channels, 4,232 people were 
invited to participate. Multiple recruitment methods were employed and 
these were useful in identifying differences in response rates and 
credit scores of the respondents across various methods of recruitment. 
Of the 4,232 individuals contacted, 128 (3%) became participants. The 
contractor \8\  helped participants obtain their 3 credit reports and 
conducted an in-depth review of the reports with each participant. The 
contractor also helped the participants to identify alleged 
inaccuracies and gave advice on the difference between a small 
inaccuracy and a material error that is likely to affect a credit 
score. Specific criteria for materiality were developed in consultation 
with Fair Isaac's analyst on the research team.\9\  If the consumer 
alleged a material error, the individual was encouraged to file a 
formal FCRA dispute so as to obtain a review of the challenged items by 
data furnishers and CRAs. The contractor prepared a dispute letter for 
any consumer who wanted to file and allege an error, material or not 
(as the FCRA permits a consumer to dispute any credit report 
information that the person believes to be inaccurate).
---------------------------------------------------------------------------

    \8\ Due to the similarity in design (i.e., the second pilot was 
constructed as a follow-up to the first), the FTC employed the same 
contractor.
    \9\ December 2008 Report (at 3). The contractor used the 
following criteria for materiality: the consumer had a credit score 
less than 760 (a cutoff widely used to identify consumers with 
lowest credit risk and for extending credit on most favorable terms) 
AND the consumer alleged an error regarding any of the following 
matters: (i) negative items (such as late payments); (ii) public 
derogatories (such as bankruptcy); (iii) accounts sent to 
collection; (iv) number of inquiries for new credit; (v) outstanding 
balances not attributable to normal monthly reporting variation; 
(vi) accounts on the report not belonging to the person who is the 
subject of the report; or (vii) duplicate entries of the same 
information (e.g., late payments or outstanding obligations) that 
were double-counted in the reported summaries of such items. To 
enhance the efficiency of the study process, the stated criteria 
modify somewhat the procedure used in the first pilot study 
(contractor's report on second pilot study at 27). In the proposed 
national study, we do not intend to use any cutoff score for 
materiality, but plan to retain the stated categories as indicating 
a dispute material to creditworthiness.
---------------------------------------------------------------------------

    Regarding the results of the study, 88 of the 128 participants 
(69%) found no errors in their credit reports. Of the 40 participants 
who alleged one or more errors that they wanted to dispute, 15 (or 12% 
of the 128) alleged a material error. For 7 of these latter cases, the 
FCRA dispute process rendered credit report changes that were made 
fully in keeping with all of the consumer's allegations.\10\ 
---------------------------------------------------------------------------

    \10\ Other cases (i.e., some of the consumer's allegations were 
confirmed while other allegations were denied) are summarized in the 
December 2008 Report (at 2 & 8).
---------------------------------------------------------------------------

    As noted above, the second pilot study (like the first) used a 
small sample and no statistical projections were made. Accordingly, no 
extensive statistical summaries were needed, nor were any given, in the 
FTC's report on the study. The primary purpose of the pilot studies was 
to refine the expert-assisted survey approach for studying credit 
report information, in preparation for a national study.
    The second pilot study confirmed the importance of having the 
contractor prepare dispute letters for consumers. This was not done in 
the first pilot study, in which only 1 of the 3 participants who alleged 
material errors on their credit reports filed
a dispute. In the follow-up pilot study, all 15 of the participants who 
alleged material errors on their credit reports received dispute 
letters from the contractor, and the outcomes of these disputes are 
known for 12 of them. This is a significant improvement over the first 
pilot study.
    As noted above, multiple recruitment methods were used to identify 
differences in response rates and in credit scores of respondents 
across various methods of recruitment. The second pilot study confirmed 
the difficulties of obtaining adequate numbers of participants with 
below-average credit scores. Purely random sampling of potential 
participants yielded too few actual participants with low credit 
scores.\11\  A weighted random sampling approach, whereby more 
invitations were extended to groups of consumers who were likely to 
have lower credit scores, produced a sample closer to national 
norms.\12\ 
---------------------------------------------------------------------------

    \11\ Table III of the December 2008 Report (at 9).
    \12\ Table 9 of the contractor's report (appendix to the 
December 2008 Report).
---------------------------------------------------------------------------

    The second pilot study indicated that it would be feasible to base 
a measure of the accuracy of credit report information on confirmed 
material errors via the FCRA dispute process. Whenever it appeared that 
a consumer's credit score could be affected by ``correcting'' an 
alleged material error, the contractor marked the credit reports (the 
frozen files)\13\  with explanations of

[[Page 53245]]

the discrepancies and sent copies of the marked reports to Fair Isaac 
for rescoring. If, via the FCRA dispute process, changes were 
subsequently made by CRAs and lenders in keeping with the consumer's 
allegations, these changed items were then designated as confirmed 
material errors. The frozen file would then be re-scored to quantify 
the impact of the confirmed error(s) on the consumer's credit score. 
The difference between the rescore of the frozen file and the original 
score would be a meaningful measure of the impact of inaccurate credit 
report information. We intend to use this type of methodology in a 
national study.\14\ 
---------------------------------------------------------------------------

    \13\ The files are called ``frozen'' because no new credit 
information was added to the consumer's original credit reports 
obtained in the study; any rescoring would thus apply only to 
potential changes or actual changes that were directly related to 
the contractor's review.
    \14\ Certain limitations regarding this methodology are 
discussed in the December 2008 Report (at 3 & 4). Yet, use of the 
FCRA dispute process appears to be the only feasible way of 
performing a nationwide survey, in view of the enormous difficulty 
and cost of attempting to ascertain the ultimate accuracy regarding 
alleged errors.
---------------------------------------------------------------------------
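
    As an illustration of the score-impact measure described above, the 
following sketch (hypothetical Python code; the participant records and 
score values are invented for illustration and are not study data) 
computes the impact of confirmed material errors as the rescore of the 
corrected frozen file minus the original score.

    # Illustrative sketch only: the actual rescoring of frozen files is
    # performed by Fair Isaac; this merely tabulates the score-impact
    # measure once original scores and rescores are available.

    def score_impact(original_score: int, frozen_file_rescore: int) -> int:
        """Impact of confirmed material errors on a participant's score:
        the rescore of the corrected frozen file minus the original score."""
        return frozen_file_rescore - original_score

    # Hypothetical records for participants with confirmed material errors.
    participants = [
        {"id": "P001", "original": 642, "rescore": 688},
        {"id": "P002", "original": 715, "rescore": 716},
    ]

    for p in participants:
        delta = score_impact(p["original"], p["rescore"])
        print(f'{p["id"]}: confirmed-error impact = {delta:+d} points')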

    As a final point of this summary of the pilot studies, the 
relatively low response rate (i.e., approximately 3% of the individuals 
contacted became participants) raises a concern about potential response 
bias in the design of a national study. This matter is
addressed below.

II. Proposed National Study

A. Description of the Collection of Information and the Proposed Use

    The proposed national study seeks to use a large representative 
sample of credit reports so that we may draw inferences, up to a 
certain level of statistical confidence, about the accuracy of credit 
reports in general. The need to employ a representative sample makes 
the initial steps of the proposed study different from the methodology 
of the second pilot study; in other respects, the methodologies of the 
two studies are largely the same. Our goal is to obtain approximately 
1,000 participants who as a group display a diversity on credit scores 
and on major demographic characteristics in line with national norms.
    The relevant population for the study comprises adult members
of households who have credit histories with Equifax, Experian, and/or 
TransUnion. To study these credit histories we propose, as a first 
step, to obtain a very large random sample (with an order of magnitude 
of 200,000 names) from one of the consumer reporting agencies in order 
to determine a set of individuals selected for possible contact (the 
``SPC list'').\15\  From this SPC list, FTC staff will draw a further 
and considerably smaller random sample (e.g., 10% sample) of 
individuals selected for contact (the ``SC list'').
---------------------------------------------------------------------------

    \15\ The information in this sample, which would include names, 
addresses, and credit scores, is to be obtained under applicable law 
and protected from disclosure by, e.g., Exemption 6 of the Freedom 
of Information Act, 5 U.S.C. 552. That information, as well as any 
credit reports that individual participants give permission to be 
analyzed for the study, will be maintained and used by the FTC and 
its contractors subject to appropriate information security 
procedures and safeguards (e.g., maintaining credit-related data 
separately from personal identifying information, requiring the 
FTC's contractors to execute confidentiality agreements, and 
limiting access to those FTC and contractor staff who have a need to 
work with the data). As noted above, the study methodology is also 
designed to prevent disclosure of any individual's participation in 
the study to any credit reporting agency.
---------------------------------------------------------------------------

    There are several reasons for this two-step process. First, the 
vast majority of the names on the SPC list will not be sent invitations 
to participate, which helps ensure that no CRA will know who is
participating in the study. Further, using the SC list, we plan to send 
proportionally more invitation letters to individuals with lower credit 
scores. Use of this weighted random sampling approach is designed to 
obtain an ultimate set of participants having credit scores 
(specifically, the lower scores) in line with national norms, as 
suggested by the results of the second pilot study.\16\ 
---------------------------------------------------------------------------

    \16\ December 2008 Report (at 9 &10).
---------------------------------------------------------------------------
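
    By way of illustration of the weighted random sampling approach 
described above, the following sketch (hypothetical Python code; the 
score bands, invitation fractions, and SC-list size are illustrative 
assumptions only, not the study's actual parameters) draws 
proportionally more invitations from lower-score groups on the SC list.

    import random

    # Illustrative sketch: score bands and invitation fractions are
    # assumptions for illustration; they are not the study's parameters.
    random.seed(1)

    # Each SC-list entry is a (study_id, credit_score) pair.
    sc_list = [(f"ID{i:05d}", random.randint(450, 850)) for i in range(20000)]

    def band(score):
        """Coarse score bands used to weight invitations toward lower scores."""
        if score < 620:
            return "low"
        if score < 740:
            return "mid"
        return "high"

    # More invitations per capita in lower-score bands (illustrative weights).
    invite_fraction = {"low": 0.30, "mid": 0.15, "high": 0.08}

    invitations = []
    for name, fraction in invite_fraction.items():
        group = [entry for entry in sc_list if band(entry[1]) == name]
        invitations.extend(random.sample(group, int(len(group) * fraction)))

    print(f"{len(invitations)} invitation letters drawn from "
          f"{len(sc_list)} SC-list entries")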

    After a substantial set of individuals (300-400 people) has agreed to 
join the study, we will have an initial sample. This sample
will be compared with the larger SPC list on credit scores and 
geographic diversity. Statistically significant differences between 
this initial sample and the larger SPC list would reflect the impact of 
non-participation. From this information, we can selectively draw 
individuals from the SC list in an effort to compensate for these 
differences as necessary.
    As a further check on a potential bias in the decision to 
participate, we plan to obtain anonymized (redacted) credit reports 
(and related credit scores)\17\  for the entire class of non-
respondents, i.e., all the people from the SC list who choose not to 
participate. Using the redacted reports and related scores we can 
determine, for example, whether non-respondents had significantly 
different credit scores or significantly different credit histories 
from those who agreed to participate.
---------------------------------------------------------------------------

    \17\ These credit reports and scores will be generated and 
maintained without name, address or personal identifiers other than 
ID numbers assigned by the study.
---------------------------------------------------------------------------
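
    As a sketch of how such a comparison might be made, the following 
hypothetical Python code (the simulated scores stand in for the 
anonymized credit scores of participants and non-respondents; the 
numbers are illustrative only) compares the two groups' mean credit 
scores using Welch's t statistic.

    import math
    import random
    import statistics

    # Simulated stand-ins for anonymized credit scores; illustrative only.
    random.seed(2)
    participant_scores = [random.gauss(700, 60) for _ in range(400)]
    non_respondent_scores = [random.gauss(680, 70) for _ in range(3600)]

    def welch_t(a, b):
        """Welch's t statistic for a difference in mean credit score."""
        mean_a, mean_b = statistics.mean(a), statistics.mean(b)
        var_a, var_b = statistics.variance(a), statistics.variance(b)
        return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

    diff = statistics.mean(participant_scores) - statistics.mean(non_respondent_scores)
    print(f"difference in mean score = {diff:.1f} points, "
          f"t = {welch_t(participant_scores, non_respondent_scores):.2f}")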

    Upon completion of the study, we will have a database with detailed 
demographic information about the participants, the type and quantity 
of alleged material errors on their credit reports, the type and 
quantity of confirmed material errors via the FCRA dispute process, and 
the impact of any such confirmed errors on the participants' credit 
scores.\18\  Further, by analyzing the redacted credit reports and 
related scores of the non-respondents, we obtain a final check on the 
degree to which the enhanced procedures were effective in achieving a 
nationally representative sample of credit reports.
---------------------------------------------------------------------------

    \18\ Using the methodology of the pilot studies, we expect to 
obtain a variety of alleged errors: incorrect report of late 
payment; multiple reports of an account with late payment; paid 
account reported as delinquent; closed account reported as 
delinquent; incorrect financial account reported (``not mine''); 
incorrect collection balance; incorrect collection account reported; 
multiple reports of an account in bankruptcy; chapter 7 accounts 
discharged but reported as delinquent, as well as further types of 
alleged errors. For these same categories we can also tabulate 
confirmed material errors via the FCRA dispute process. As explained 
above, the rescoring of the frozen files will then provide the 
impact of any confirmed errors on the participants' credit scores.
---------------------------------------------------------------------------

B. Estimated Hours Burden

    Consumer participation in the proposed national study would involve 
an initial preparation for the in-depth interview and time spent by 
participants to understand, review, and if deemed necessary, dispute 
information in their credit reports. Invitation letters will be sent in 
progressive waves in order to obtain approximately 1,000 participants. 
The individuals who receive these letters are drawn from the SC list 
discussed above and will be asked to go directly to a designated Web 
site for enrollment if they wish to participate; registration is 
expected to take at most 15 minutes per participant.\19\  The 
registration process thus comes to approximately 250 hours (reckoned at 
1/4 hour for each of 1,000 consumers).
---------------------------------------------------------------------------

    \19\ At the registration Web site, a person may take the time to 
read several disclosures, including a privacy disclosure and an 
outline of the various steps of the study that every participant 
agrees to undertake. The consumer is then asked to enter basic 
contact information (e.g., name, address, telephone number, best 
time to be contacted further about the study) and to enter an 
electronic signature certifying the consumer's consent to 
participate in the study. For those who may not have Internet access 
to register, the contractor would also have a procedure to mail the 
appropriate disclosures and study steps to the respondent and then 
receive back enrollment information and the consumer's signed consent 
in paper form.
---------------------------------------------------------------------------

    For the purpose of calculating burden under the PRA regarding the 
review process of the credit reports, FTC staff

[[Page 53246]]

submits the following estimates that are based on the contractor's 
experience with the second pilot study. Some participants prepare 
thoroughly in advance of the in-depth review of their credit reports. In 
such cases, even a complicated report may generally be reviewed in under 
30 minutes. Other consumers may not find time for
significant preparation in advance of the in-depth review, and in such 
cases the interview could take up to an hour. The participants in the 
second pilot study reported taking an average of 69 minutes (median 53 
minutes) to prepare for the interview, with 90% taking between 10 and 
180 minutes. The interviews themselves took an average of 19 minutes 
(median 15 minutes) with 90% taking between 5 and 45 minutes. Overall, 
the average combined time for preparation and the interview was about 
90 minutes (1.5 hours). For a national study involving 1,000 consumers, 
FTC staff thus estimates the burden hours for the review process to be 
approximately 1,500 hours (1,000 consumers x 1.5 hours). Further adding 
on the time spent for the registration process (0.25 hours per 
participant), the total burden hours come to approximately 1,750 hours.
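
    The arithmetic behind these estimates can be restated compactly; the 
following sketch (Python, using only the figures given above) reproduces 
the burden-hour totals.

    # Reproduces the burden-hour arithmetic stated above.
    participants = 1000
    registration_hours_each = 0.25   # at most 15 minutes per participant
    review_hours_each = 1.5          # average preparation plus interview time

    registration_total = participants * registration_hours_each   # 250 hours
    review_total = participants * review_hours_each                # 1,500 hours
    total = registration_total + review_total                      # 1,750 hours
    print(f"Total estimated burden: {total:,.0f} hours")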

C. Estimated Cost Burden

    The cost per consumer for their participation should be negligible. 
Participation is voluntary and it will not require any start-up or 
capital expenditure. There is no labor time expenditure beyond the 1.75 
hours per consumer estimated above. Participants may receive an 
honorarium to compensate them for their time. The amount will be 
determined by FTC staff in consultation with the contractor according 
to an analysis of customary procedures and a consideration of response 
rates within key categories, such as, response rates for consumers with 
impaired credit. As with the pilot studies, participants will not pay 
for their credit reports or credit scores.

D. Summary of and Response to Public Comments to Prior 60-Day Notice

    The commenters were the Consumer Data Industry Association (CDIA), 
Mr. Chris Hoofnagle of the Berkeley Center for Law & Technology, and 
Privacy Times, whose comment was submitted by Mr. Evan Hendricks (and 
signed by additional parties). No comments addressed the cost and hour 
burden estimates or
challenged the need or the importance of the study. Overall, the 
comments addressed the qualifications of any potential contractor, the 
universe of participants to be covered by the study, and some concerns 
about specific parts of the methodology of the study.
    The comment from the CDIA, submitted by Mr. Stuart Pratt, is 
generally supportive while expressing certain concerns. The CDIA (at 2) 
believes that the FTC's use of consumer interviews combined with the 
FCRA dispute process ``compares favorably to the flawed methodology 
employed by consumer groups in their `studies' of credit report 
accuracy.'' The CDIA recommends the FTC highlight these differences in 
its communications about the study. As discussed above, in its 2004 
Report to Congress, the FTC reviewed all prior studies and created a 
design for a national study to specifically address certain 
shortcomings of prior approaches. In an upcoming report to Congress 
about the results of the national study, the FTC will again point out 
the ways in which the study has addressed prior shortcomings.
    The CDIA (at 1) expresses the concern that the methodology may 
over-sample consumers with low credit scores; it recommends the 
ultimate study group have credit scores that ``are reflective of the 
distribution of scores in the databases of the nationwide consumer 
reporting agencies.'' FTC staff agrees with the stated recommendation. 
As discussed in the referenced FR notice of July 20, 2009 (at 35194), 
the second pilot study confirmed that purely random sampling of 
potential participants yields too few actual participants with low 
credit scores. In the national study, invitation letters will be sent 
in progressive waves, and proportionally more invitation letters will 
be sent to groups having lower credit scores. Based on our knowledge of 
the second pilot study and also the knowledge that will be gained from 
the response rates of the earlier waves of letters in the national 
study, FTC staff will be able to adjust subsequent waves of letters to 
the potential respondents in certain score ranges so as to achieve a 
total set of respondents whose credit scores are indeed in line with 
national norms. It is possible, although not very likely, that the 
methodology could render a set of respondents having too many people 
with low scores. However, since the national distribution of credit 
scores is known (with great refinement), there are recognized 
statistical procedures to ultimately correct any over-sampling of low 
scores (should it occur) and to ensure the statistical reliability of 
the results, including the reliability of the results for the 
population as a whole.\20\ 
---------------------------------------------------------------------------

    \20\ See, for example, Harnett, Donald L., Statistical Methods 
(3rd ed.), Addison-Wesley Publishing Co., 1984 (pages 253-254).
---------------------------------------------------------------------------
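
    As an illustration of such a correction, the following sketch 
(hypothetical Python code; the score bands, the ``national'' shares, and 
the sample composition are illustrative assumptions, not actual data) 
shows a simple post-stratification weighting in which each score band is 
weighted by the ratio of its national share to its share in the sample.

    from collections import Counter

    # Illustrative post-stratification sketch. The bands and "national"
    # shares below are assumptions; the actual national distribution of
    # credit scores would be supplied separately.
    national_share = {"low": 0.25, "mid": 0.40, "high": 0.35}

    # Hypothetical study sample that over-represents low-score consumers.
    sample_bands = ["low"] * 400 + ["mid"] * 380 + ["high"] * 220
    counts = Counter(sample_bands)
    sample_share = {b: counts[b] / len(sample_bands) for b in counts}

    # Weight each participant so weighted band shares match national shares.
    weights = {b: national_share[b] / sample_share[b] for b in national_share}
    for b in ("low", "mid", "high"):
        print(f"{b}: sample share {sample_share[b]:.2f}, weight {weights[b]:.2f}")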

    The CDIA (at 1) also expresses a concern for reaching out to 
consumers who do not have Internet access. As explained in the FR 
notice of July 20, 2009 (at 35195), participants will use the Internet 
to register for the study. However, for those who may not have Internet 
access, the contractor will also have a procedure to mail the 
appropriate disclosures and the study steps to the respondent and then 
receive enrollment information and the consumer's signed consent in 
paper form. The in-depth review of the credit reports with the 
participants will occur over the telephone and does not require Internet 
access.
    The CDIA (at 2) recommends that, in assessing errors by the change in 
credit score attributable to them, the FTC also include measures of how a 
change in score would impact a consumer's interest rate or other credit 
decisions; e.g., some changes in credit score would keep a consumer in 
the same ``band of risk'' determined by the lender, while other changes 
could place the consumer in a more favorable band of risk and thus allow 
the lender to proffer a noticeably better interest rate. FTC staff agrees 
with this recommendation. In reporting the results of the study to 
Congress, staff fully intends to include the type of discussion and 
associated measures here indicated.
    The comment from the Berkeley Center for Law & Technology, 
submitted by Mr. Chris Hoofnagle, strongly supports the FTC's announced 
goal of acquiring ``1000 participants who as a group display a diversity 
on credit scores and on major demographic characteristics in line with 
national norms.'' The commenter further recommends, regarding the 
qualifications of any potential contractor, that the entity be highly 
qualified to perform consumer surveys and that it be a neutral entity 
(i.e., have no stake in the outcome of the study). FTC staff readily 
concurs with the expressed concern. The FTC will publicly solicit 
competitive bids for performing the study in keeping with a detailed 
scope of work (to be announced). Staff will carefully review the 
credentials associated with each bid and proposal and will seek a 
contractor who is highly qualified to perform the required work and who 
has no stake in the outcome of the study.
    The comment letter from Mr. Evan Hendricks of Privacy Times (signed 
by additional parties) covers several of the concerns noted above and 
addressed there (e.g., qualifications of the study contractor and the 
need for a diverse set of credit scores reflective of national

[[Page 53247]]

norms). The commenter further recommends that the study pay special 
attention to the matter of data matching procedures, covering such 
matters as the use of Social Security Numbers and partial matches on 
consumer identifiers. The matter of data matching procedures has been 
reviewed in the 2004 Report to Congress, and staff does not anticipate 
that this study will specifically address the internal data matching 
procedures used by credit bureaus. However, the contractor will keep a 
detailed narrative regarding each participant, including specific 
errors alleged and their subsequent disposition. In tabulating the 
types of confirmed errors via the dispute process, the study will 
acquire a great deal of information on the main sources of error in 
credit reports.\21\  Further, in regard to an expressed concern from 
Mr. Hendricks about recognizing ID theft as an important source of 
error, the category of alleged error called ``not mine'' will be 
separated into the subcategories of ``mixed file'' and ``ID theft.''
---------------------------------------------------------------------------

    \21\ See referenced Federal Register Notice at 35193 (note 9) 
and at 35194 (note 18) for the types of errors to be tabulated.
---------------------------------------------------------------------------

E. Request for Comments to Current 30-Day Notice

    Interested parties are invited to submit written comments 
electronically or in paper form. Comments should refer to ``National 
Accuracy Study: Paperwork Comment (FTC file no. P044804)'' to 
facilitate the organization of comments. Please note that your comment 
-- including your name and your state -- will be placed on the public 
record of this proceeding, including on the publicly accessible FTC Web 
site, at (http://www.ftc.gov/os/publiccomments.shtm).
    Because comments will be made public, they should not include any 
sensitive personal information, such as an individual's Social Security 
Number; date of birth; driver's license number or other state 
identification number, or foreign country equivalent; passport number; 
financial account number; or credit or debit card number. Comments also 
should not include any sensitive health information, such as medical 
records or other individually identifiable health information. In 
addition, comments should not include any ``[t]rade secret or any 
commercial or financial information which is obtained from 
any person and which is privileged or confidential . . .,'' as provided 
in Section 6(f) of the Federal Trade Commission Act (``FTC Act''), 15 
U.S.C. 46(f), and Commission Rule 4.10(a)(2), 16 CFR 4.10(a)(2). 
Comments containing material for which confidential treatment is 
requested must be filed in paper form, must be clearly labeled 
``Confidential,'' and must comply with FTC Rule 4.9(c).\22\ 
---------------------------------------------------------------------------

    \22\ The comment must be accompanied by an explicit request for 
confidential treatment, including the factual and legal basis for 
the request, and must identify the specific portions of the comment 
to be withheld from the public record. The request will be granted 
or denied by the Commission's General Counsel, consistent with 
applicable law and the public interest. See FTC Rule 4.9(c), 16 CFR 
4.9(c).
---------------------------------------------------------------------------

    Because paper mail addressed to the FTC is subject to delay due to 
heightened security screening, please consider submitting your comments 
in electronic form. Comments filed in electronic form should be 
submitted by using the following Web link: (https://secure.commentworks.com/ftc/FACTA319studypra2) (and following the 
instructions on the web-based form). To ensure that the Commission 
considers an electronic comment, you must file it on the web-based form 
at the Web link: (https://secure.commentworks.com/ftc/FACTA319studypra2). If this Notice appears at (http://www.regulations.gov), you may also file an electronic comment through 
that Web site. The Commission will consider all comments that 
regulations.gov forwards to it.
    A comment filed in paper form should include the ``National 
Accuracy Study: Paperwork Comment (FTC file no. P044804)'' reference 
both in the text and on the envelope, and should be mailed or delivered 
to the following address: Federal Trade Commission, Office of the 
Secretary, Room H-135 (Annex J), 600 Pennsylvania Avenue, NW, 
Washington, DC 20580. The FTC is requesting that any comment filed in 
paper form be sent by courier or overnight service, if possible, 
because U.S. postal mail in the Washington area and at the Commission 
is subject to delay due to heightened security precautions.
    Comments on any proposed filing, recordkeeping, or disclosure 
requirements that are subject to paperwork burden review under the 
Paperwork Reduction Act (``PRA'') should additionally be submitted to: 
Office of Information and Regulatory Affairs, Office of Management and 
Budget (``OMB''), Attention: Desk Officer for Federal Trade Commission. 
Comments should be submitted via facsimile to (202) 395-5167 because 
U.S. postal mail at the OMB is subject to delays due to heightened 
security precautions.
    The FTC Act and other laws the Commission administers permit the 
collection of public comments to consider and use in this proceeding as 
appropriate. The Commission will consider all timely and responsive 
public comments that it receives, whether filed in paper or electronic 
form. Comments received will be available to the public on the FTC's 
Web site, to the extent practicable, at (http://www.ftc.gov/os/publiccomments.shtm). As a matter of discretion, the Commission makes 
every effort to remove home contact information for individuals from 
the public comments it receives before placing those comments on the 
FTC's Web site. More information, including routine uses permitted by 
the Privacy Act, may be found in the FTC's privacy policy, at (http://www.ftc.gov/ftc/privacy.shtm).

David C. Shonka,
Acting General Counsel.
[FR Doc. E9-24992 Filed 10-15-09; 8:45 am]
BILLING CODE 6750-01-S