Two distinct parsing stages in nonword reading aloud: Evidence from Russian.

Word reading partly depends on the activation of sublexical letter clusters. Previous research has studied which types of letter clusters have psychological saliency, but less is known about cognitive mechanisms of letter string parsing. Here, we take advantage of the high degree of context-dependency of the Russian orthography to examine whether consonant-vowel (CV) clusters are treated as units in two stages of sublexical processing. In two experiments using a nonword reading task, we use two orthogonal manipulations: (a) insertion of a visual disruptor (#) to assess whether CV clusters are kept intact during the early visual parsing stage, and (b) presence of context-dependent grapheme-phoneme correspondences (GPCs; e.g., л[а] → /l/; л[я] → /lʲ/), to assess whether CV clusters remain intact or are split during the print-to-speech conversion stage. The results suggest that although CV clusters are initially processed as perceptual units in the early visual parsing stage, letters and not CV clusters drive print-to-speech conversion.
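To make the notion of a context-dependent GPC concrete, the sketch below is a toy illustration (not part of the paper's materials): a simplified Python mapping in which the phoneme assigned to a consonant letter depends on the following vowel letter, as in л[а] → /l/ versus л[я] → /lʲ/. The function name and the small rule set are hypothetical and cover only a few consonants for demonstration.

```python
# Illustrative sketch only: a toy context-dependent grapheme-to-phoneme rule
# for Russian CV clusters. The consonant's phoneme depends on whether the
# following vowel letter is a "softening" (palatalizing) one.

SOFTENING_VOWELS = set("яеёюи")   # vowel letters that palatalize the preceding consonant
PLAIN_VOWELS = set("аэоуы")       # vowel letters that leave it plain

def consonant_phoneme(consonant: str, next_vowel: str) -> str:
    """Return a rough phoneme for a consonant given the following vowel letter."""
    base = {"л": "l", "н": "n", "т": "t"}.get(consonant, consonant)
    if next_vowel in SOFTENING_VOWELS:
        return base + "ʲ"          # palatalized, e.g. л[я] → /lʲ/
    return base                    # plain, e.g. л[а] → /l/

print(consonant_phoneme("л", "а"))  # -> l
print(consonant_phoneme("л", "я"))  # -> lʲ
```

The point of the manipulation in the paper is that such a rule forces the reader to consult the vowel before the consonant's pronunciation can be fixed, which is what lets the authors probe whether the CV cluster is converted as a unit or letter by letter.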
