2 Replies Latest reply on Nov 16, 2016 7:08 AM by Ian Conlon

    Schema.ini: No TextDelimiter

    Ian Conlon

      We're using tab-delimited text files as our primary dashboard data sources. To get Tableau to import the files in the correct format, we're using a schema.ini file (per this suggestion). For the most part, the schema file works perfectly: fields that are meant to be strings come in as strings, while those that are numeric come in as numeric. The one problem has been with the text-type fields in our files, which do NOT have a text qualifier. Tableau's default setting of a double quote (") as the text qualifier causes our files to import incorrectly, because some of our text fields contain double quotes.


      What is the correct way to indicate in a schema.ini file that there is no text qualifier? I've tried the following, all to no avail:






      The first and second versions produce no errors, but when the file is imported into Tableau, Tableau still assigns the double quote (") as the text delimiter. The third version causes the import process to fail outright.


      For a little more context, this is how the top of each table schema is laid out:









      ...the files then go on to specify each column name and data type.
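
      For readers unfamiliar with the format, the top of a schema.ini section for a tab-delimited file typically looks something like the following. This is a generic sketch based on the Microsoft Text Driver's documented entries, not the exact file from this post; the file name and columns are hypothetical:

      ```ini
      ; Section name must match the data file's name exactly (hypothetical here)
      [sales_data.txt]
      Format=TabDelimited    ; fields are separated by tabs, not commas
      ColNameHeader=True     ; first row of the file contains column names
      MaxScanRows=0          ; scan the whole file when inferring types
      ; ...followed by one ColN entry per column, naming it and typing it:
      Col1=OrderID Long
      Col2=Description Text
      ```

      Declaring each column's type explicitly (rather than letting the driver guess from a sample of rows) is what keeps string-like numeric codes from being imported as numbers.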


      Any help would be greatly appreciated!

        • 1. Re: Schema.ini: No TextDelimiter
          Cheryl Grinds

          Hi Ian,

          Based on the example you provided, it looks like you're using a tab-delimited file. What happens if you use the schema.ini file without including the TextDelimiter line at all?



          • 2. Re: Schema.ini: No TextDelimiter
            Ian Conlon

            Hi Cheryl,


            I'm not sure why it worked, but your suggestion seems to have fixed the problem. For a little background, I came across the "TextDelimiter=" option in a Tableau forum comment and figured I should include it in my own schema.ini file. Removing the line entirely seems to have done the trick. I would have expected Tableau to simply ignore an option that wasn't valid, but apparently not. Very strange.
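
            For anyone who finds this thread later, the working configuration is simply a schema.ini section with no TextDelimiter entry at all. A minimal sketch (hypothetical file name, column list abbreviated):

            ```ini
            [report.txt]
            Format=TabDelimited
            ColNameHeader=True
            ; No TextDelimiter line here. With the line removed, the
            ; embedded double quotes in our text fields came through as
            ; literal characters instead of being treated as qualifiers.
            Col1=ID Long
            Col2=Comment Text
            ```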


            I'll keep monitoring to see if the problem returns, but for the time being, we can consider this issue solved.