
(Solved): Both test functions I wrote are below, but when I want to test I get wrong results and the system doesn't accept my code.

I wrote the two functions below, but when I run the tests I get wrong results and the system doesn't accept my code. I don't know exactly where my problem is, so I am sharing the problem and my solution.

import numpy as np
from numpy.linalg import eig


def PCA(X, num_components):
    """
    Args:
        X: ndarray of size (N, D), where D is the dimension of the data,
           and N is the number of datapoints
        num_components: the number of principal components to use.
    Returns:
        the reconstructed data, the sample mean of X, the principal values,
        and the principal components
    """
    N, D = X.shape

    # First perform normalization on the digits so that they have zero mean
    # and unit variance.
    X_normalized, mean = normalize(X)

    # Then compute the data covariance matrix S (D x D).
    S = np.cov(X_normalized, rowvar=False, bias=True)

    # Next find the eigenvalues and corresponding eigenvectors of S.
    # eig() does not return them sorted, so sort in descending order of
    # eigenvalue before selecting the top `num_components`.
    eig_vals, eig_vecs = eig(S)
    order = np.argsort(eig_vals.real)[::-1]
    eig_vals, eig_vecs = eig_vals[order], eig_vecs[:, order]

    # The top `num_components` eigenvalues/eigenvectors are the principal
    # values and principal components. Keep only the real parts, since
    # finite floating point precision can introduce tiny imaginary parts.
    principal_vals = np.real(eig_vals[:num_components])
    principal_components = np.real(eig_vecs[:, :num_components])

    # Reconstruct the data using the basis spanned by the principal
    # components. The mean was subtracted during normalization, so add it
    # back to the reconstructed data.
    P = projection_matrix(principal_components)
    reconst = (P @ X_normalized.T).T + mean
    return reconst, mean, principal_vals, principal_components
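
Both functions rely on normalize and projection_matrix, which the notebook asks you to implement earlier and which are not shown in the question. For reference, here is a minimal sketch of what they are assumed to do; in particular, because the reconstruction only adds mean back, the sketch assumes normalize just subtracts the per-feature mean. Your notebook's versions and exact signatures may differ.

import numpy as np
from numpy.linalg import inv


# Hypothetical helpers, sketched only so the code above is self-contained;
# compare them against the versions in your own notebook.
def normalize(X):
    """Center X by subtracting the per-feature (column) mean."""
    mu = X.mean(axis=0)
    return X - mu, mu


def projection_matrix(B):
    """Projection matrix onto the subspace spanned by the columns of B."""
    return B @ inv(B.T @ B) @ B.T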

 

and 

def PCA_high_dim(X, num_components):
    """Compute PCA for small sample size but high-dimensional features. 
    Args:
        X: ndarray of size (N, D), where D is the dimension of the sample,
           and N is the number of samples
        num_components: the number of principal components to use.
    Returns:
        X_reconstruct: (N, D) ndarray, the reconstruction of X from the
        first `num_components` principal components, together with the
        sample mean, the principal values and the principal components.
    """
    N, D = X.shape

    # Normalize the dataset.
    X_normalized, mean = normalize(X)

    # When N < D it is cheaper to work with the (N x N) matrix
    # M = X X^T / N than with the (D x D) covariance matrix S = X^T X / N.
    M = (X_normalized @ X_normalized.T) / N

    # Find the eigenvalues and eigenvectors of M and sort them in
    # descending order of eigenvalue; eig() does not sort for you.
    eig_vals, eig_vecs = eig(M)
    order = np.argsort(eig_vals.real)[::-1]
    eig_vals, eig_vecs = eig_vals[order], eig_vecs[:, order]

    # Take the top `num_components` eigenvalues/eigenvectors of M and keep
    # only the real parts (finite floating point precision can introduce
    # tiny imaginary components).
    principal_values = np.real(eig_vals[:num_components])
    U = np.real(eig_vecs[:, :num_components])

    # Compute the eigenvectors for the original (D x D) covariance matrix:
    # if M v = lambda v, then S (X^T v) = lambda (X^T v).
    # Normalize them to unit length so they form an orthonormal basis.
    principal_components = X_normalized.T @ U
    principal_components = principal_components / np.linalg.norm(principal_components, axis=0)

    # Reconstruct the data from the lower dimensional representation.
    # Remember to add back the sample mean.
    P = projection_matrix(principal_components)
    reconst = (P @ X_normalized.T).T + mean
    return reconst, mean, principal_values, principal_components
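
Assuming the helper sketch above, a quick local sanity check (separate from the autograder) is that PCA and PCA_high_dim should produce the same reconstruction and the same principal values on the same data; the principal components themselves may differ in sign.

import numpy as np

np.random.seed(0)
X = np.random.randn(5, 20)   # N=5 samples, D=20 features, i.e. N < D

reconst_a, mean_a, vals_a, comps_a = PCA(X, 2)
reconst_b, mean_b, vals_b, comps_b = PCA_high_dim(X, 2)

# Reconstructions and principal values should agree up to floating point error.
np.testing.assert_allclose(reconst_a, reconst_b, atol=1e-8)
np.testing.assert_allclose(vals_a, vals_b, atol=1e-8)
print("PCA and PCA_high_dim agree")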

Here is the error I get. What am I missing?

ERROR: test_PCA (week4_tests.Test)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/autograde_5q4slzm7/week4_tests.py", line 185, in test_PCA
    np.testing.assert_allclose(result[0], expected[0])
TypeError: 'PCA' object is not subscriptable



Expert Answer


There are a few things that you could try to debug your code. Here are some suggestions: Make sure that you are calling the correct test function. You ...
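
One concrete thing worth checking, suggested by the traceback itself: result[0] fails because result is an instance of a class named PCA rather than the tuple returned by the function above. A common way this happens (an assumption here, since the rest of the notebook is not shown) is that a later cell rebinds the name PCA, for example by importing scikit-learn's class:

from sklearn.decomposition import PCA   # rebinds the name PCA to sklearn's class

result = PCA(n_components=2)   # now an sklearn estimator object, not your tuple
result[0]                      # TypeError: 'PCA' object is not subscriptable

If that is what happened, remove the import or alias it (for example, from sklearn.decomposition import PCA as SKPCA) so that the autograder calls the function defined above again.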